MINIMALLY INVASIVE SURGICAL SYSTEM
Patent Abstract:
A method and apparatus for hand gesture control in a minimally invasive surgical system. The invention relates to a minimally invasive surgical system in which a hand tracking system tracks the location of a sensor element mounted on part of a human hand. A system control parameter is generated based on the location of the part of the human hand, and operation of the minimally invasive surgical system is controlled using that system control parameter. Thus, the minimally invasive surgical system includes a hand tracking system that tracks the location of part of a human hand. A controller coupled to the hand tracking system converts the location into a system control parameter and injects a command based on that parameter into the surgical system.

Publication number: BR112012011422B1
Application number: R112012011422-0
Filing date: 2010-11-11
Publication date: 2020-09-29
Inventors: Brandon D. Itkowitz; Simon Dimaio; Tao Zhao
Applicant: Intuitive Surgical Operations, Inc.
IPC main class:
Patent Description:
Background

Field of the Invention

[0001] Aspects of the present invention relate to the control of minimally invasive surgical systems, and more particularly to the use of a surgeon's hand movement to control a minimally invasive surgical system.

Related Art

[0002] Methods and techniques for sensing hand positions and gestures are known. For example, some video game controllers use hand tracking input. The Nintendo Wii® gaming platform, for example, supports wireless position- and orientation-sensing remote controls. (Wii® is a registered trademark of Nintendo of America Inc., Redmond, Washington, U.S.A.). The use of gestures and other physical movements, such as swinging a bat or waving a magic wand, provides the fundamental gameplay element for that platform. The Sony PlayStation Move has features similar to those of the Nintendo Wii® gaming platform.

[0003] A wireless CyberGlove® motion capture data glove from CyberGlove Systems includes eighteen data sensors, with two bend sensors on each finger, four abduction sensors, and sensors measuring thumb crossover, palm arch, wrist flexion, and wrist abduction. (CyberGlove® is a registered trademark of CyberGlove Systems LLC of San Jose, CA.). When a three-dimensional tracking system is used with the CyberGlove® motion capture data glove, x, y, z, yaw, pitch, and roll position and orientation information for the hand is available. The motion capture system for the CyberGlove® motion capture data glove has been used in digital prototype evaluation, virtual reality, biomechanics, and animation.

[0004] Another data glove, with forty sensors, is the ShapeHand data glove from Measurand Inc. A lightweight, portable hand motion capture system from Measurand Inc. includes an array of flexible ribbons that captures index finger and thumb motion along with the position and orientation of the hand and forearm in space.

[0005] In In-Cheol Kim and Sung-Il Chien, "Analysis of 3D Hand Trajectory Gestures Using Stroke-Based Composite Hidden Markov Models," Applied Intelligence, Vol. 15, No. 2, p. 131-143, September-October 2001, Kim and Chien explore the use of three-dimensional trajectory input with a Polhemus sensor for gesture recognition. Kim and Chien propose this form of input because three-dimensional trajectories offer greater discrimination power than two-dimensional gestures, which are predominantly used in video-based approaches. For their experiments, Kim and Chien used a Polhemus magnetic position tracking sensor attached to the back of a Fakespace PinchGlove. The PinchGlove provides a means for the user to signal the beginning and end of a gesture, while the Polhemus sensor captures the three-dimensional trajectory of the user's hand.

[0006] In Elena Sanchez-Nielsen, et al., "Hand Gesture Recognition for Human-Machine Interaction," Journal of WSCG, Vol. 12, No. 1-3, ISSN 1213-6972, WSCG'2004, February 2-6, 2004, Plzen, Czech Republic, a real-time vision system is proposed for use in visual interaction environments through hand gesture recognition, using general-purpose hardware and low-cost sensors, such as a personal computer and a webcam. In Pragati Garg, et al., "Vision Based Hand Gesture Recognition," 49 World Academy of Science, Engineering and Technology, 972-977 (2009), a review of vision-based hand gesture recognition was presented.
One conclusion drawn was that most approaches rely on several underlying assumptions that may hold in a controlled laboratory setting but do not generalize to arbitrary settings. The authors concluded that "Computer vision methods for hand gesture interfaces must surpass current performance in terms of robustness and speed to achieve interactivity and usability." In the medical field, gesture recognition has been considered for sterile browsing of radiology images. See Juan P. Wachs, et al., "A Gesture-based Tool for Sterile Browsing of Radiology Images," Journal of the American Medical Informatics Association (2008; 15:321-323, DOI 10.1197/jamia.M24).

Summary

[0007] In one aspect, a hand tracking system in a minimally invasive surgical system tracks the location of part of a human hand. A system control parameter for the minimally invasive surgical system is generated based on the location of the part of the human hand. Operation of the minimally invasive surgical system is controlled using the system control parameter.

[0008] In one aspect, sensor elements mounted on part of a human hand are tracked to obtain locations of the part of the human hand. A control point position and orientation are generated based on the location. Teleoperation of a device in a minimally invasive surgical system is controlled based on the control point position and orientation. In one aspect, the device is a teleoperated slave surgical instrument. In another aspect, the device is a virtual proxy presented in a video image of a surgical site. Examples of a virtual proxy include a virtual slave surgical instrument, a virtual hand, and a virtual telestration device.

[0009] In an additional aspect, a grip closure parameter is generated in addition to the control point position and orientation. The grip of an end effector of the teleoperated slave surgical instrument is controlled based on the grip closure parameter.

[00010] In another aspect, the system control parameter is a control point position and orientation used for teleoperation of the slave surgical instrument. In yet another aspect, the system control parameter is determined from both hands: a control point position and orientation for one of the two hands, and a control point position and orientation for the other of the two hands. The control points are used for teleoperation of an endoscopic camera manipulator in the minimally invasive surgical system.

[00011] In yet another aspect, sensor elements mounted on part of a second human hand are tracked in addition to the sensor elements on the part of the first human hand. A position and orientation for a second control point are generated based on the location of the part of the second human hand. In this aspect, both the control point and the second control point are used in the teleoperation control.

[00012] In yet another aspect, sensor elements mounted on digits of the human hand are tracked. A motion between the digits is determined, and the orientation of a teleoperated slave surgical instrument in a minimally invasive surgical system is controlled based on that motion.

[00013] When the motion is a first motion, the control includes rolling a tip of a wrist of the slave surgical instrument about its pointing direction.
When the motion is a second motion different from the first motion, the control includes a yaw motion of the slave surgical instrument wrist.

[00014] A minimally invasive surgical system includes a hand tracking system and a controller coupled to the hand tracking system. The hand tracking system tracks the locations of a plurality of sensor elements mounted on part of a human hand. The controller transforms the locations into a control point position and orientation. The controller sends a command to move a device in the minimally invasive surgical system based on the control point. Again, in one aspect the device is a teleoperated slave surgical instrument, while in another aspect the device is a virtual proxy presented in a video image of a surgical site.

[00015] In one aspect, the system also includes a master finger tracking device including the plurality of tracking sensors. The master finger tracking device additionally includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor in the plurality of tracking sensors is attached to the first finger loop. A second tracking sensor in the plurality of tracking sensors is attached to the second finger loop.

[00016] Thus, in one aspect, a minimally invasive surgical system includes a master finger tracking device. The master finger tracking device includes a compressible body, a first finger loop attached to the compressible body, and a second finger loop attached to the compressible body. A first tracking sensor is attached to the first finger loop. A second tracking sensor is attached to the second finger loop.

[00017] The compressible body includes a first end, a second end, and an outer exterior surface. The outer exterior surface includes a first portion that extends between the first and second ends, and a second portion, opposite and removed from the first portion, that extends between the first and second ends.

[00018] The compressible body also has a length. The length is selected to limit the separation between a first digit and a second digit of the human hand.

[00019] The first finger loop is attached to the compressible body adjacent to the first end and extends about the first portion of the outer exterior surface. Upon placing the first finger loop on a first digit of the human hand, a first part of the first portion of the outer exterior surface contacts the first digit.

[00020] The second finger loop is attached to the compressible body adjacent to the second end and extends about the first portion of the outer exterior surface. Upon placing the second finger loop on a second digit of the human hand, a second part of the first portion of the outer exterior surface contacts the second digit. As the first and second digits are moved toward each other, the compressible body is positioned between the two digits so that the compressible body provides resistance to the motion.

[00021] The thickness of the compressible body is selected so that, with a tip of the first digit just touching a tip of the second digit, the compressible body is less than fully compressed. The compressible body is configured to provide haptic feedback corresponding to a grip force of an end effector of a teleoperated slave surgical instrument.

[00022] In one aspect, the first and second tracking sensors are passive electromagnetic sensors.
In an additional aspect, each passive electromagnetic tracking sensor has six degrees of freedom.

[00023] One method of using the master finger tracking device includes tracking a first location of a sensor mounted on a first digit of a human hand and a second location of another sensor mounted on a second digit. Each location has N degrees of freedom, where N is an integer greater than zero. The first location and the second location are mapped to a control point location. The control point location has six degrees of freedom, which are fewer than or equal to the 2*N degrees of freedom of the two sensed locations. The first location and the second location are also mapped to a parameter having a single degree of freedom. Operation of a slave surgical instrument in a minimally invasive surgical system is controlled based on the control point location and the parameter.

[00024] In a first aspect, the parameter is a grip closure distance. In a second aspect, the parameter comprises an orientation. In one aspect N is six, while in a different aspect N is five.

[00025] In yet an additional aspect, sensor elements mounted on part of a human hand are tracked to obtain a plurality of locations of the part of the human hand. A hand gesture from a plurality of known hand gestures is selected based on the plurality of locations. Operation of a minimally invasive surgical system is controlled based on the hand gesture.

[00026] The hand gesture can be any one of a hand gesture pose, a hand gesture trajectory, or a combination of a hand gesture pose and a hand gesture trajectory. When the hand gesture is a hand gesture pose and the plurality of known hand gestures includes a plurality of known hand gesture poses, a user interface of the minimally invasive surgical system is controlled based on the hand gesture pose.

[00027] Additionally, in one aspect, when the hand gesture is a hand gesture pose, selecting the hand gesture includes generating a set of observed features from the plurality of tracked locations. The set of observed features is compared with feature sets of the plurality of known hand gesture poses. One of the known hand gesture poses is selected as the hand gesture pose. The selected known hand gesture pose is mapped to a system command, and the system command is triggered in the minimally invasive surgical system.

[00028] In yet an additional aspect, when the hand gesture includes a hand gesture trajectory, the user interface of the minimally invasive surgical system is controlled based on the hand gesture trajectory.

[00029] In the minimally invasive surgical system with the hand tracking system and the controller, the controller converts the locations into a hand gesture. The controller sends a command to modify an operating mode of the minimally invasive surgical system based on the hand gesture.

[00030] In yet another aspect, a sensor element mounted on part of a human hand is tracked to obtain the location of the part of the human hand. Based on the location, the method determines whether the position of the part of the human hand is within a threshold distance of the position of a master tool grip of a minimally invasive surgical system. Operation of the minimally invasive surgical system is controlled based on the result of the determination.
In one aspect, teleoperation of a teleoperated slave surgical instrument coupled to the master tool grip is controlled based on the result of the determination. In another aspect, the display of a user interface, or the display of a visual proxy, is controlled based on the result of the determination.

[00031] In one aspect, the position of the part of the human hand is specified by a control point position. In another aspect, the position of the part of the human hand is an index finger position.

[00032] A minimally invasive surgical system includes a hand tracking system. The hand tracking system tracks the location of part of a human hand. A controller uses the location to determine whether a surgeon's hand is close enough to a master tool grip to permit a particular operation of the minimally invasive surgical system.

[00033] The minimally invasive surgical system also includes a controller coupled to the hand tracking system. The controller converts the location to a system control parameter, and injects into the minimally invasive surgical system a command based on the system control parameter.

Brief Description of the Drawings

[00034] Figure 1 is a high-level diagrammatic view of a minimally invasive teleoperated surgical system including a hand tracking system.

[00035] Figures 2A to 2G are examples of various configurations of a hand-mounted master finger tracking grip used to control a teleoperated slave surgical instrument of the minimally invasive teleoperated surgical system of figure 1.

[00036] Figures 3A to 3D are examples of hand gesture poses used to control system modes in the minimally invasive teleoperated surgical system of figure 1.

[00037] Figures 4A to 4C are examples of hand gesture trajectories that are also used to control system modes in the minimally invasive teleoperated surgical system of figure 1.

[00038] Figure 5 is an illustration of placements of fiducial markers for hand tracking in a camera-based tracking system.

[00039] Figures 6A and 6B are more detailed diagrams of the surgeon's console of figure 1, and include examples of coordinate systems used in hand tracking by the minimally invasive teleoperated surgical system of figure 1.

[00040] Figure 7 is a more detailed illustration of a master finger tracking grip worn on the hand, and of the locations and coordinate systems used in hand tracking by the minimally invasive teleoperated surgical system of figure 1.

[00041] Figure 8 is a process flow diagram of a process used in the tracking system to track the fingers and to generate data for teleoperation of a slave surgical instrument in the minimally invasive teleoperated surgical system of figure 1.

[00042] Figure 9 is a more detailed process flow diagram of the MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER process of figure 8.

[00043] Figure 10 is a process flow diagram of a process used in the tracking system to recognize hand gesture poses and hand gesture trajectories.

[00044] Figure 11 is a process flow diagram of a process used in the tracking system to detect hand presence.

[00045] Figure 12 is an illustration of an example of a master finger tracking device.

[00046] Figure 13 is an illustration of a video image, presented on a display device, including a visual proxy, which in this example includes a virtual ghost instrument, and a teleoperated slave surgical instrument.
[00047] Figure 14 is an illustration of a video image, presented on a display device, including visual proxies, which in this example include a pair of virtual hands, and teleoperated slave surgical instruments.

[00048] Figure 15 is an illustration of a video image, presented on a display device, including visual proxies, which in this example include a virtual telestration device and a virtual ghost instrument, and teleoperated slave surgical instruments.

[00049] In the drawings, the first digit of a three-digit reference number indicates the number of the figure in which the element with that reference number first appeared, and the first two digits of a four-digit reference number indicate the number of the figure in which the element with that reference number first appeared.

Detailed Description

[00050] As used herein, a location includes a position and an orientation.

[00051] As used herein, a hand gesture, sometimes simply called a gesture, includes a hand gesture pose, a hand gesture trajectory, and a combination of a hand gesture pose and a hand gesture trajectory.

[00052] Aspects of the present invention augment the control capability of minimally invasive surgical systems, for example, the da Vinci® minimally invasive teleoperated surgical system marketed by Intuitive Surgical, Inc. of Sunnyvale, California, by using hand location information in the control of the minimally invasive surgical system. The measured location of one or more digits of the hand is used to determine a system control parameter, which in turn is used to trigger a system command in the surgical system. The system commands depend on the location of the person whose hand is being tracked, that is, on whether the person is at a surgeon's console.

[00053] When the measured locations are for digits of a hand of a person not at a surgeon's console, the system commands include a command to change the orientation of part of a teleoperated slave surgical instrument based on the combination of hand orientation and relative motion of two digits of the hand, and a command to move a tip of a teleoperated slave surgical instrument so that the motion of the tip follows the motion of part of the hand. When the measured locations are for digits of a hand of a person at a surgeon's console, the system commands include commands that permit or prevent motion of a slave surgical instrument from continuing to follow the motion of a master tool grip. When the measured locations are either for digits of a hand of a person not at a surgeon's console, or for digits of a hand of a person at a surgeon's console, the system commands include commands directing the system, or part of the system, to take an action based on a hand gesture pose, and commands directing the system, or part of the system, to take an action based on a hand gesture trajectory.

[00054] Figure 1 is a high-level diagrammatic view of a minimally invasive teleoperated surgical system 100, for example, the da Vinci® Surgical System, including a hand tracking system. There are other parts, cables, etc. associated with the da Vinci® Surgical System, but these are not illustrated in figure 1 to avoid detracting from the description. Further information regarding minimally invasive surgical systems can be found, for example, in US patent application No. 11/762,165 (filed June 13, 2007, disclosing "Minimally Invasive Surgical System"), and US patent No.
6,331,181 (issued December 18, 2001, disclosing "Surgical Robotic Tools, Data Architecture, And Use"), both of which are incorporated herein by reference. See also, for example, US patents No. 7,155,315 (filed December 12, 2005; disclosing "Camera Referenced Control In A Minimally Invasive Surgical Apparatus") and No. 7,574,250 (filed February 4, 2003; disclosing "Image Shifting Apparatus And Method For A Telerobotic System"), both of which are incorporated herein by reference.

[0055] In this example, system 100 includes a cart 110 with a plurality of manipulators. Each manipulator, and the teleoperated slave surgical instrument controlled by that manipulator, can be coupled to and decoupled from the master instrument manipulators on surgeon's console 185, and in addition can be coupled to and decoupled from the mechanically ungrounded and unpowered master finger tracking grip 170, sometimes called master finger tracking grip 170.

[0056] A stereoscopic endoscope 112 mounted on manipulator 113 provides an image of surgical site 103 within patient 111 that is displayed on display 187 and on the display of surgeon's console 185. The image includes images of any of the slave surgical devices in the field of view of stereoscopic endoscope 112. The interactions between the master instrument manipulators on surgeon's console 185, the slave surgical devices, and stereoscopic endoscope 112 are the same as in a known system and so are known to those skilled in the art.

[0057] In one aspect, surgeon 181 moves at least one digit of the surgeon's hand, which in turn causes a sensor in master finger tracking grip 170 to change location. Hand tracking transmitter 175 provides a field so that the new position and orientation of the digit are sensed by master finger tracking grip 170. The newly sensed position and orientation are provided to hand tracking controller 130.

[0058] In one aspect, as explained more fully below, hand tracking controller 130 maps the sensed position and orientation to a control point position and a control point orientation in an eye coordinate system of surgeon 181. Hand tracking controller 130 sends this location information to system controller 140, which in turn sends a system command to the teleoperated slave surgical instrument coupled to master finger tracking grip 170. As explained more fully below, using master finger tracking grip 170, surgeon 181 can control, for example, the grip of an end effector of the teleoperated slave surgical instrument, as well as the roll and yaw of a wrist coupled to the end effector.

[0059] In another aspect, hand tracking of at least part of the hand of surgeon 181 or of the hand of surgeon 180 is used by hand tracking controller 130 to determine whether a hand gesture pose is made by the surgeon, or whether a combination of a hand gesture pose and a hand gesture trajectory is made by the surgeon. Each hand gesture pose, and each trajectory combined with a hand gesture pose, is mapped to a different system command. The system commands control, for example, changes of system mode and other aspects of minimally invasive surgical system 100.
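The control flow of paragraphs [0057] and [0058] reduces to three steps: sense the finger locations, map them to system control parameters in the surgeon's eye frame, and command the slave instrument. The following is a minimal sketch of that loop under stated assumptions: every class name, method name, and interface here is a hypothetical illustration invented for this description, not an actual da Vinci or hand tracker API, and the mapping function is sketched separately later in this description.

```python
# Minimal sketch of the sense -> map -> command loop of paragraphs [0057]-[0058].
# All names and interfaces are hypothetical; none of this is a published API.

class HandTrackingControllerSketch:
    def __init__(self, tracker, system_controller, map_to_eye_frame):
        self.tracker = tracker                    # yields (thumb pose, index pose)
        self.system = system_controller           # accepts teleoperation commands
        self.map_to_eye_frame = map_to_eye_frame  # tracker frame -> surgeon eye frame

    def update(self):
        # 1. Sense the digit locations (position and orientation) from the
        #    sensors of the master finger tracking grip.
        thumb_pose, index_pose = self.tracker.read_sensors()
        # 2. Map the two sensed locations to a control point pose and a grip
        #    closure value expressed in the surgeon's eye coordinate system.
        pose_eye, grip = self.map_to_eye_frame(thumb_pose, index_pose)
        # 3. Forward the system control parameters to the system controller,
        #    which issues the command to the teleoperated slave instrument.
        self.system.command_slave(pose_eye, grip)
```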
[0060] For example, instead of using pedals and switches as in known minimally invasive surgical systems, a hand gesture, either a hand gesture pose or a hand gesture trajectory, is used (i) to initiate following between the motion of the master tool grips and the associated teleoperated slave surgical instruments, (ii) for master clutch activation (which decouples master control from the slave instrument), (iii) for endoscopic camera control (which allows the master to control endoscope motion or characteristics, such as focus or electronic zoom), (iv) for robotic arm swap (which swaps a particular master control between two slave instruments), and (v) for TILEPRO™ swap (which toggles the display of auxiliary video windows on the surgeon's display). (TILEPRO is a registered trademark of Intuitive Surgical, Inc. of Sunnyvale, CA, USA).

[0061] When there are only two master tool grips in system 100 and surgeon 180 wants to control motion of a slave surgical instrument other than the two teleoperated slave surgical instruments coupled to the two master tool grips, the surgeon can lock one or both of the two teleoperated slave surgical instruments in place using a first hand gesture. The surgeon then associates one or both of the master tool grips with other slave surgical instruments held by other manipulator arms by using a different hand gesture, which in this implementation provides a swap of the master tool grip to another teleoperated slave surgical instrument. Surgeon 181 performs an equivalent procedure when there are only two master finger tracking grips in system 100.

[0062] In yet another aspect, a hand tracking unit 186 mounted on surgeon's console 185 tracks at least part of the hand of surgeon 180 and sends the sensed location information to hand tracking controller 130. Hand tracking controller 130 determines when the surgeon's hand is close enough to a master tool grip to permit following, for example, motion of the slave surgical instrument following motion of the master tool grip. As explained more fully below, in one aspect hand tracking controller 130 determines the position of the surgeon's hand and the position of the corresponding master tool grip. If the difference between the two positions is within a predetermined distance, for example, less than a threshold separation, following is permitted; otherwise following is inhibited. Thus, distance is used as a measure of the presence of the surgeon's hand at the master tool grip on surgeon's console 185. In another aspect, when the position of the surgeon's hand relative to the position of the master tool grip is less than the threshold separation, display of a user interface on a display device is inhibited, for example, turned off. Conversely, when the position of the surgeon's hand relative to the position of the master tool grip is greater than the threshold separation, the user interface is displayed on the display device, for example, turned on.

[0063] Detecting the presence of the surgeon's hand has been a long-standing problem. Presence detection has been attempted many times using different contact sensing technologies, such as capacitive switches, pressure sensors, and mechanical switches. However, these approaches are inherently problematic because surgeons have different preferences for how and where they hold the master tool grip.
The use of distance as the presence measure is advantageous in that this type of presence detection allows the surgeon to touch the master tool grip lightly and then momentarily break physical contact to adjust the master tool grip, but it does not dictate how the surgeon holds the master tool grip with his or her fingers.

[00064] Control of a surgical instrument by hand tracking

[00065] An example of a mechanically ungrounded and unpowered master finger tracking grip 270, sometimes called master finger tracking grip 270, is illustrated in figures 2A to 2D in different configurations that are described more fully below. Master finger tracking grip 270 includes sensors 211, 212, sometimes referred to as finger- and thumb-mounted sensors 211, 212, which independently track the location (position and orientation, in one example) of a tip of index finger 292B and of a tip of thumb 292A, that is, the locations of two digits of the surgeon's hand. Thus, the location of the hand itself is tracked, unlike the tracking of the location of the master tool grips in known minimally invasive surgical systems.

[00066] In one aspect, the sensors provide tracking of six degrees of freedom (three translations and three rotations) for each digit on which a sensor is mounted. In another aspect, the sensors provide tracking of five degrees of freedom (three translations and two rotations) for each digit on which a sensor is mounted.

[00067] In yet another aspect, the sensors provide tracking of three degrees of freedom (three translations) for each digit on which a sensor is mounted. When two digits are each tracked with three degrees of freedom, the total of six translational degrees of freedom is sufficient to control a slave surgical instrument that does not include a wrist mechanism.

[00068] A padded foam connector 210 is connected between the finger- and thumb-mounted sensors 211, 212. Connector 210 constrains thumb 292A and index finger 292B, that is, the digits of hand 291R, to be within a fixed distance of each other; that is, there is a maximum separation distance between the digits of hand 291R on which master finger tracking grip 270 is mounted. As thumb 292A and index finger 292B are moved from the maximum separation (figure 2A) to a fully closed configuration (figure 2D), the padding provides positive feedback to help surgeon 181 control the grip force of an end effector of a teleoperated slave surgical instrument coupled to master finger tracking grip 170.

[00069] For the position illustrated in figure 2A, with thumb 292A and index finger 292B separated by the maximum distance permitted by master finger tracking grip 270, the grip force is minimal. Conversely, in the position illustrated in figure 2D, where thumb 292A and index finger 292B are as close together as connector 210 permits, that is, separated by the minimum distance permitted by master finger tracking grip 270, the grip force is maximal. Figures 2B and 2C represent positions that are mapped to intermediate grip forces.

[00070] As explained more fully below, the locations (positions and orientations) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a grip closure parameter, for example, a normalized grip closure value, that is used to control the grip of the teleoperated slave surgical instrument coupled to master finger tracking grip 270.
Specifically, the sensed locations of thumb 292A and index finger 292B are mapped to the grip closure parameter by hand tracking controller 130.

[00071] Thus, the location of part of the hand of surgeon 181 is tracked. Based on the tracked location, a system control parameter of minimally invasive surgical system 100, that is, a grip closure parameter, is generated by hand tracking controller 130 and supplied to system controller 140. System controller 140 uses the grip closure parameter in generating a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated surgical instrument to configure its end effector with a grip closure corresponding to the grip closure parameter. Thus, minimally invasive surgical system 100 uses the grip closure parameter to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.

[00072] In addition, the locations (positions and orientations) of thumb 292A and index finger 292B in figures 2A to 2D are mapped to a control point position and a control point orientation by hand tracking controller 130. The control point position and control point orientation are mapped into an eye coordinate system of surgeon 181 and then provided to system controller 140 via a command signal. The control point position and control point orientation in the eye coordinate system are used by system controller 140 for teleoperation of the slave surgical instrument coupled to master finger tracking grip 170.

[00073] Again, the location of part of the hand of surgeon 181 is tracked. Based on the tracked location, another system control parameter of minimally invasive surgical system 100, that is, the control point position and orientation, is generated by hand tracking controller 130. Hand tracking controller 130 transmits a command signal with the control point position and orientation to system controller 140. System controller 140 uses the control point position and orientation to generate a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated surgical instrument to position itself based on the control point position and orientation. Thus, minimally invasive surgical system 100 uses the control point position and orientation to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.

[00074] In addition to determining grip closure based on the positions of sensors 211, 212, other relative motion between index finger 292B and thumb 292A is used to control the yaw motion and the roll motion of the slave surgical instrument. Rubbing index finger 292B and thumb 292A crosswise against each other, as if twirling a spindle about imaginary axis 293 (represented by the arrows in figure 2E), produces a roll of the tip of the slave surgical instrument, while sliding the index finger and thumb lengthwise back and forth along each other, in the direction shown by arrow 295 (figure 2F), produces a yaw motion about the X-axis of the slave surgical instrument. This is achieved by mapping the vector between the index fingertip position and the thumb tip position to define the orientation of the X-axis of the control point.
The position of the control point remains relatively stationary because the finger and thumb each slide symmetrically along axis 295. While the finger and thumb motions are not perfectly symmetric, the position still remains sufficiently stationary that the user can easily correct any disturbance that may occur.

[00075] Again, the locations of part of the hand of surgeon 181 are tracked. Based on the tracked locations, yet another system control parameter, that is, the relative motion between two digits of the surgeon's hand 291R, is generated by hand tracking controller 130.

[00076] Hand tracking controller 130 converts the relative motion into an orientation for the teleoperated slave surgical instrument coupled to master finger tracking grip 170. Hand tracking controller 130 sends a command signal with the orientation to system controller 140. While this orientation is an absolute orientation mapping, system controller 140, in one aspect, uses this input with ratcheting during teleoperation in the same way as an orientation input from any other passive gimbal master tool grip. An example of ratcheting is described in commonly assigned US patent application No. 12/495,213 (filed June 30, 2009, disclosing "Ratcheting For Master Alignment Of A Teleoperated Surgical Instrument"), which is hereby incorporated by reference in its entirety.

[00077] System controller 140 uses the orientation to generate a system command that is sent to the teleoperated slave surgical instrument. The system command instructs the teleoperated surgical instrument to rotate based on the orientation. Thus, minimally invasive surgical system 100 uses the motion between the two digits to control operation of the teleoperated slave surgical instrument of minimally invasive surgical system 100.

[00078] When the motion is a first motion, for example, rubbing index finger 292B and thumb 292A crosswise as if twirling a spindle, the orientation is a roll, and the system command results in a roll of the slave surgical instrument wrist tip about its pointing direction. When the motion is a second motion different from the first motion, for example, sliding the index finger and thumb lengthwise along each other (figure 2F), the orientation is a yaw, and the system command results in a yaw motion of the slave surgical instrument wrist.

[00079] In yet another aspect, when the surgeon changes the operating mode of the system to a gesture recognition mode, both hands are tracked, and control point positions and orientations for both hands are generated based on the sensed positions and orientations of the hand-mounted sensors, in one aspect. For example, as illustrated in figure 2G, the tips of the thumb and index finger of each hand touch to form a circular shape. The sensed position of each hand is mapped by hand tracking controller 130 to a pair of control point positions. The control point pair is sent with a camera control system event to system controller 140.

[00080] Thus, in this aspect, the location of part of each hand of surgeon 181 is tracked. Another system control parameter of minimally invasive surgical system 100, that is, the pair of control point positions, is generated by hand tracking controller 130 based on the tracked locations. Hand tracking controller 130 sends the pair of control point positions with a camera control system event to system controller 140.
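The fingertip-to-control-point mapping of paragraphs [00070] to [00074] can be made concrete with a short sketch. The following is a minimal sketch, assuming the fingertip positions arrive as 3-vectors in a common tracker frame; the separation constants, the midpoint rule, and the function name map_to_control_point are illustrative assumptions for this description, not the exact computation performed by hand tracking controller 130.

```python
import numpy as np

MAX_SEP = 0.115  # meters; assumed fully-open thumb-index separation
MIN_SEP = 0.015  # meters; assumed separation when the grip is fully pinched

def map_to_control_point(p_thumb, p_index):
    """Return (control point position, control point X-axis, grip closure)."""
    p_thumb = np.asarray(p_thumb, dtype=float)
    p_index = np.asarray(p_index, dtype=float)

    # Control point position: midpoint between the thumb and index fingertips.
    position = 0.5 * (p_thumb + p_index)

    # Control point X-axis: unit vector from thumb tip to index fingertip
    # (paragraph [00074]), so symmetric lengthwise sliding of the two digits
    # changes yaw while leaving the position essentially stationary.
    v = p_index - p_thumb
    separation = float(np.linalg.norm(v))
    if separation < 1e-9:
        raise ValueError("fingertip sensors report coincident positions")
    x_axis = v / separation

    # Grip closure: separation normalized to [0, 1]; 0.0 is fully open at the
    # grip's maximum permitted separation, 1.0 is fully pinched.
    closure = float(np.clip((MAX_SEP - separation) / (MAX_SEP - MIN_SEP), 0.0, 1.0))
    return position, x_axis, closure
```

For example, a 115 mm fingertip separation yields a closure value of 0.0 (fully open), while 15 mm or less saturates at 1.0. Deriving the remaining control point axes from the sensed sensor orientations is not shown here.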
[00081] In response to the camera control system event, system controller 140 generates a camera control system command based on the pair of control point positions. The camera control system command is sent to a teleoperated endoscopic camera manipulator in minimally invasive surgical system 100. Thus, minimally invasive surgical system 100 uses the pair of control point positions to control operation of the endoscopic camera manipulator of minimally invasive surgical system 100.

[00082] System control by hand gesture poses and hand gesture trajectories

[00083] In this aspect, after being placed in a gesture detection mode of operation, hand tracking controller 130 detects a hand gesture pose, or a hand gesture pose and a hand gesture trajectory. Controller 130 maps hand gesture poses to certain system mode control commands, and similarly controller 130 maps hand gesture trajectories to other system mode control commands. Note that the mapping of poses and of trajectories is independent, and so this differs from, for example, the tracking of sign language. The ability to generate system commands and to control system 100 using hand gesture poses and hand gesture trajectories, instead of manipulating switches, numerous pedals, etc. as in known minimally invasive surgical systems, provides greater ease of use of system 100 for the surgeon.

[00084] When a surgeon is standing, using hand gesture poses and hand gesture trajectories to control system 100 makes it unnecessary for the surgeon to take his or her eyes off the patient and/or viewing screen and search for a pedal or switch when the surgeon wants to change the system mode. Finally, the elimination of various switches and pedals reduces the floor space required by the minimally invasive teleoperated surgical system.

[00085] The particular set of hand gesture poses and hand gesture trajectories used to control minimally invasive surgical system 100 is not critical, so long as each hand gesture pose and each hand gesture trajectory is unambiguous. Specifically, a hand gesture pose should not be interpretable by hand tracking controller 130 as one or more other hand gesture poses in the set of poses, and a hand gesture trajectory should not be interpretable as more than one hand gesture trajectory in the set of trajectories. Thus, the hand gesture poses and hand gesture trajectories discussed below are illustrative only and are not intended to be limiting.

[00086] Figures 3A to 3D are examples of hand gesture poses 300A to 300D, respectively. Figures 4A to 4C are examples of hand gesture trajectories. Note, for example, that the configuration in figure 2A appears similar to that in figure 3A, but the operating mode of minimally invasive surgical system 100 is different when the two configurations are used.

[00087] In figure 2A, the teleoperated minimally invasive slave surgical instrument is coupled to master finger tracking grip 170 and system 100 is in following mode, so that motion of the teleoperated minimally invasive slave surgical instrument follows the tracked motion of the surgeon's hand. In figures 3A to 3D and 4A to 4C, the surgeon places system 100 in gesture recognition mode and then makes one of the illustrated hand gesture poses or hand gesture trajectories.
Hand gesture poses and hand gesture trajectories are used to control system modes and are not used in the following mode of operation. For example, system modes controlled with hand gesture poses include enabling, disabling, and cycling between visual displays, engaging a visual display, and telestration draw/erase.

[00088] In hand gesture pose 300A (figure 3A), thumb 292A and index finger 292B are separated beyond a master clutch threshold, for example, a separation between the two digits of hand 291R greater than 115 mm. Hand gesture pose 300B (figure 3B), with index finger 292B extended and thumb 292A curled, is used to signal hand tracking controller 130 that the surgeon is tracing a hand gesture trajectory (see figures 4A and 4B). Hand gesture pose 300C (figure 3C), with thumb 292A up and index finger 292B curled, is used to turn on a user interface and to cycle between modes in the user interface. Hand gesture pose 300D (figure 3D), with thumb 292A down and index finger 292B curled, is used to turn off the user interface. Other hand gesture poses could include an "A-okay" hand gesture pose, an L-shaped hand gesture pose, etc.

[00089] Hand tracking controller 130, in one aspect, uses a multidimensional feature vector to recognize and identify a hand gesture pose. Initially, a plurality of hand gesture poses is specified. Then a feature set including a plurality of features is specified. The feature set is designed to uniquely identify each hand gesture pose in the plurality of poses.

[00090] A hand gesture pose recognition process is trained using a training database. The training database includes a plurality of examples of each hand gesture pose. The plurality of examples includes feature vectors for the poses produced by a number of different people. A feature set is generated for each of the examples in the training database. These feature sets are used to train a multidimensional Bayesian classifier, as explained more fully below.

[00091] When surgeon 180 wants to enter the hand gesture mode of operation, the surgeon activates a switch, for example, depresses a pedal, and then makes a hand gesture pose with at least one hand. Note that while this example requires a single pedal, it permits elimination of the other pedals on the foot tray of known minimally invasive surgical systems and thus still has the advantages described above. Hand tracking unit 186 sends signals representing the sensed positions and orientations of the thumb and index finger of the surgeon's hand or hands to hand tracking controller 130.

[00092] Using the tracking data for the surgeon's digits, hand tracking controller 130 generates a set of observed features. Hand tracking controller 130 then uses the trained multidimensional Bayesian classifier and the Mahalanobis distance to determine the likelihood that the set of observed features is a feature set of a hand gesture pose in the plurality of poses. This is done for each of the hand gesture poses in the plurality of poses.
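A minimal sketch of this recognition step follows, assuming per-pose training statistics (feature mean, inverse covariance, and the maximum Mahalanobis distance observed in training) are already available. The data layout and names are illustrative assumptions; the eligibility-and-shortest-distance rule in the sketch implements the selection criterion described in the next paragraph.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of feature vector x from a pose's training mean."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def classify_pose(features, pose_models):
    """pose_models maps pose name -> (mean, inverse covariance, max training distance)."""
    x = np.asarray(features, dtype=float)
    best_pose, best_dist = None, np.inf
    for name, (mean, cov_inv, max_dist) in pose_models.items():
        dist = mahalanobis(x, mean, cov_inv)
        # A pose is eligible only if the observed features lie within the
        # largest Mahalanobis distance seen for that pose during training.
        if dist < max_dist and dist < best_dist:
            best_pose, best_dist = name, dist
    return best_pose  # None means no known pose was recognized
```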
[00093] The hand gesture pose in the plurality of poses that is selected by hand tracking controller 130 as the observed hand gesture pose is the one having the shortest Mahalanobis distance, provided that distance is less than the maximum Mahalanobis distance in the training database for that hand gesture pose. The selected hand gesture pose is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.

[00094] System controller 140 processes the system event and issues a system command. For example, if hand gesture pose 300C (figure 3C) is detected, system controller 140 sends a system command to display controller 150 to turn on the user interface. In response, display controller 150 executes at least part of user interface module 155 on processor 151 to generate a user interface on the display of surgeon's console 185.

[00095] Thus, in this aspect, minimally invasive surgical system 100 tracks the location of part of a human hand. Based on the tracked location, a system control parameter is generated, for example, a hand gesture pose is selected. The hand gesture pose is used to control the user interface of minimally invasive surgical system 100, for example, to display the user interface on the display of surgeon's console 185.

[00096] User interface control is only illustrative and is not intended to be limiting. A hand gesture can be used to make any of the mode changes of a known minimally invasive surgical system, for example, master clutch, camera control, camera focus, manipulator arm swap, etc.

[00097] If the hand gesture pose recognition process determines that the observed hand gesture pose is the hand gesture pose used for a hand gesture trajectory, a system event is not injected by hand tracking controller 130 based on the pose recognition. Instead, a hand gesture trajectory recognition process is initiated.

[00098] In this example, hand gesture pose 300B (figure 3B) is the pose used to make a hand gesture trajectory. Figures 4A and 4B are two-dimensional examples of hand gesture trajectories 400A and 400B that are made using hand gesture pose 300B. Figure 4C presents other two-dimensional examples of hand gesture trajectories that can be used.

[00099] In one aspect, the hand gesture trajectory recognition process uses a Hidden Markov Model. To generate the probability distributions for the Hidden Markov Model, a training database is required. Before obtaining the training database, a set of hand gesture trajectories is specified. In one aspect, the sixteen hand gesture trajectories of figure 4C are selected.

[000100] In one aspect, a number of test subjects are selected to make each of the hand gesture trajectories. In one example, each test subject made each trajectory a predetermined number of times. The position and orientation data for each subject for each trajectory made were saved in the training database. In one aspect, as explained more fully below, the training database was used to train a discrete left-right Hidden Markov Model using an iterative Baum-Welch method.

[000101] When a surgeon makes a trajectory, the data are converted to an observation sequence O by hand tracking controller 130.
Given observation sequence O and the Hidden Markov Models, hand tracking controller 130 determines which hand gesture trajectory corresponds to the observed symbol sequence. In one aspect, hand tracking controller 130 uses the forward recursion algorithm with the Hidden Markov Models to generate the total probability of the observed symbol sequence. The hand gesture trajectory with the highest probability is selected if that probability is greater than a threshold probability. If the highest probability is less than the threshold probability, no hand gesture trajectory is selected and processing ends.

[000102] The selected hand gesture trajectory is mapped to a system event. Hand tracking controller 130 injects the system event into system controller 140.

[000103] System controller 140 processes the system event and issues a system command. For example, if the selected hand gesture trajectory is mapped to an event to change the illumination level in the surgical field, system controller 140 sends a system event to a controller in an illuminator to change the illumination level.
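The scoring step of paragraphs [00099] to [000103] can be sketched as follows, assuming each trained gesture trajectory is represented by a discrete HMM with initial state distribution pi, state transition matrix A, and symbol emission matrix B. The Baum-Welch training itself is not shown, and all names and data layouts are illustrative assumptions.

```python
import numpy as np

def forward_probability(obs, pi, A, B):
    """Total probability P(obs | model) by forward recursion.
    obs: sequence of discrete symbol indices; pi: (S,) initial state
    probabilities; A: (S, S) state transitions; B: (S, V) symbol emissions."""
    alpha = pi * B[:, obs[0]]
    for symbol in obs[1:]:
        # Propagate state probabilities one step, then weight by the
        # probability of emitting the next observed symbol.
        alpha = (alpha @ A) * B[:, symbol]
    return float(alpha.sum())

def recognize_trajectory(obs, models, threshold):
    """models maps gesture name -> (pi, A, B); returns the best gesture or None."""
    scores = {name: forward_probability(obs, *model) for name, model in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > threshold else None
```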
Presence detection by hand tracking

[000104] In one aspect, as indicated above, the locations of the surgeon's hands 291R, 291L (figure 6A) are tracked to determine whether teleoperation of minimally invasive surgical system 100 is permitted and, in some aspects, whether a user interface is displayed to the surgeon. Again, hand tracking controller 130 tracks at least part of the hand of surgeon 180B (figure 6A). Hand tracking controller 130 generates the location of a master tool grip, for example, master tool grip 621 (figure 6B), which represents master tool grips 621L, 621R (figure 6A), and the location of part of the hand. Hand tracking controller 130 maps the two locations into a common coordinate frame and then determines the distance between the two locations in the common coordinate frame. The distance is a system control parameter of the minimally invasive surgical system that is based on the tracked location of part of the surgeon's hand.

[000105] If the distance is less than a safety threshold, that is, less than a maximum permitted separation between the part of the hand and the master tool grip, teleoperation of minimally invasive surgical system 100 is permitted; otherwise teleoperation is inhibited. Similarly, in the aspect that uses presence detection to control display of a user interface, if the distance is less than the safety threshold, that is, less than the maximum permitted separation between the part of the hand and the master tool grip, display of a user interface on the display of minimally invasive surgical system 100 is inhibited; otherwise display of the user interface is permitted.

[000106] Thus, the distance is used in the teleoperation control of minimally invasive surgical system 100. Specifically, hand tracking controller 130 sends a system event to system controller 140 indicating whether teleoperation is permitted. In response to the system event, system controller 140 configures system 100 to permit or inhibit teleoperation.
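A minimal sketch of this distance gate follows, assuming the hand and grip positions arrive in different frames and that 4x4 homogeneous transforms into a common frame are known from the tracking and kinematics data. The threshold value and all names are illustrative assumptions, not values used by the described system.

```python
import numpy as np

PRESENCE_THRESHOLD = 0.10  # meters; an assumed maximum hand-to-grip separation

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-vector p."""
    return (T @ np.append(p, 1.0))[:3]

def teleoperation_permitted(p_hand_tracker, p_grip_console,
                            common_from_tracker, common_from_console):
    """Gate teleoperation on the hand-to-grip distance in a common frame."""
    p_hand = transform_point(common_from_tracker, np.asarray(p_hand_tracker, float))
    p_grip = transform_point(common_from_console, np.asarray(p_grip_console, float))
    return bool(np.linalg.norm(p_hand - p_grip) < PRESENCE_THRESHOLD)
```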
Hand Location Tracking Technologies

[000107] Before considering the various aspects of hand tracking described above in further detail, an example of a tracking technology is described. This example is illustrative only and, in view of this description, any tracking technology that provides the necessary hand or finger location information can be used.

[000108] In one aspect, pulsed DC electromagnetic tracking is used with sensors mounted on two digits of a hand, for example, the thumb and index finger, as illustrated in figures 2A to 2D and in figure 7. Each sensor measures six degrees of freedom and, in one aspect, has a size of eight millimeters by two millimeters by one and a half millimeters (8 mm x 2 mm x 1.5 mm). The tracking system has a dexterous workspace of a 0.8 m hemisphere and a sensing resolution of 0.5 mm in position and 0.1 degree in orientation. The update rate is 160 Hz, with a sensing latency of four milliseconds. When integrated into a system, additional latency may be incurred by additional filtering and communication. An effective command latency of up to 30 milliseconds has been found to be acceptable.

[000109] In this regard, the tracking system includes an electromagnetic hand tracking controller, sensors for use in the master finger tracking grip, and a hand tracking transmitter. A tracking system suitable for use in an embodiment of the present invention is offered by Ascension Technology Corporation of Burlington, Vermont, USA as the 3D guidance trakSTAR™ system with a Mid-Range transmitter. (trakSTAR™ is a registered trademark of Ascension Technology Corporation.) The transmitter generates pulsed DC magnetic fields for high-precision tracking over medium ranges, specified as 78 centimeters (31 inches). The system provides dynamic tracking with 240 to 420 updates per second for each sensor. The miniaturized passive sensors are unaffected by power-line noise sources. A clear line of sight between the transmitter and the sensors is not required. There is all-attitude tracking with no inertial drift or optical interference, and there is high metal immunity with no distortion from non-magnetic metals.

[000110] While an electromagnetic tracking system with finger-mounted sensors is used here, this is illustrative only and is not intended to be limiting. For example, a pen-like device can be held by the surgeon. The pen-like device is a finger piece with three or more non-collinear fiducial markers on its outer surface. Typically, to make at least three fiducial markers visible from any viewpoint, more fiducial markers are used because of self-occlusion. The fiducial markers are sufficient to determine motion of the finger piece in six degrees of freedom (three translations and three rotations), and thus the motion of the hand holding the pen-like device. In one aspect, the pen-like device also senses grip.

[000111] The pen-like device is viewed by two or more cameras with known parameters to locate the fiducial markers in three dimensions and to infer the three-dimensional pose of the finger piece. The fiducial markers can be implemented, for example, as 1) retro-reflective spheres with illumination close to the camera; 2) concave or convex hemispheres with illumination close to the camera; or 3) active markers, such as blinking LEDs. In one aspect, near-infrared illumination of the finger piece is used, and filters are used to block the visible spectrum at the camera to minimize disturbance from background clutter.

[000112] In another aspect, a data glove 501 (figure 5) or a bare hand 502 is used, and fiducial markers 511 are attached to the thumb and index finger of glove 501 (and/or to other fingers of the glove) that the surgeon will wear, and/or directly to the skin of hand 502. Again, redundant markers can be used to accommodate self-occlusion. Fiducial markers can also be placed on the other fingers to enable more user interface features through specifically defined hand gestures.

[000113] The three-dimensional locations of the fiducial markers are computed by triangulation from multiple cameras having a common field of view, as sketched at the end of this section. The three-dimensional locations of the fiducial markers are used to infer the three-dimensional pose (translation and orientation) of the hand and also the grip size.

[000114] The marker locations need to be calibrated before use. For example, the surgeon can show the hand with the markers in different poses to the cameras. The different poses are then used in the calibration.

[000115] In yet another aspect, markerless hand tracking is used. The motion of the articulated hand can be tracked by using images from one or more cameras and processing those images with executing computer software. The executing software does not need to track all degrees of freedom of the hand to be useful. The executing software only needs to track the part related to the two digits of the hand to be useful for controlling a surgical instrument, as demonstrated herein.

[000116] In camera-based tracking, the accuracy of the measurements depends on the accuracy of the marker locations in the image; the accuracy of the three-dimensional reconstruction given the camera geometry; and redundant data, such as more than a minimum number (for example, three) of fiducial markers, more than a minimum number (one or two) of cameras, and smoothing and temporal filtering.

[000117] The accuracy of the three-dimensional reconstruction relies heavily on the accuracy of the camera calibration. Some fiducial markers attached to known locations on the surgeon's console can be used to determine the extrinsic parameters (rotations and translations) of multiple cameras relative to the surgeon's console. This process can be done automatically. Active fiducial markers can be used as the calibration fiducial markers, since such markers can be turned on only during a calibration process before the procedure. During the procedure, the calibration fiducial markers are turned off to avoid confusion with the fiducial markers used to locate the surgeon's hands. The relative extrinsic parameters can also be computed by observing a moving marker in the common field of view of the cameras.

[000118] Other tracking technologies that are suitable for use include, but are not limited to, inertial tracking, depth camera tracking, and fiber bend sensing.

[000119] As used herein, a sensor element, sometimes called a tracking sensor, can be a sensor for any of the hand tracking technologies described above, for example, a passive electromagnetic sensor, a fiducial marker, or a sensor for any of the other technologies.
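As referenced in paragraph [000113], the following is a minimal sketch of multi-camera triangulation of one fiducial marker, assuming calibrated cameras given as 3x4 projection matrices. The linear least-squares (DLT) formulation used here is a standard technique chosen for illustration, not necessarily the exact method used in the system described.

```python
import numpy as np

def triangulate(projections, pixels):
    """Least-squares 3D location of one marker from two or more views.
    projections: list of 3x4 camera projection matrices; pixels: matching
    list of (u, v) image coordinates of the marker in each camera."""
    rows = []
    for P, (u, v) in zip(projections, pixels):
        # Each view contributes two linear constraints on the homogeneous
        # marker location X: u * (P[2] @ X) = P[0] @ X, and similarly for v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least squares: the right singular vector associated with
    # the smallest singular value minimizes ||A X|| subject to ||X|| = 1.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With more than two cameras, the extra rows simply over-determine the system, which is how the redundant data mentioned in paragraph [000116] improves reconstruction accuracy.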
Surgeon's console 185B includes a three-dimensional viewer 610, sometimes referred to as viewer 610, main instrument manipulators 620L, 620R with main tool grips 621L, 621R, and a base 630. Main tool grip 621 (figure 6B) is representative of main tool grips 621L, 621R. [000121] Main tool grips 621L, 621R of main instrument manipulators 620L, 620R are held by surgeon 180B using the index finger and thumb, so that targeting and grasping involve intuitive pointing and pinching motions. Main instrument manipulators 620L, 620R, in combination with main tool grips 621L, 621R, are used to control teleoperated slave surgical instruments, teleoperated endoscopes, etc., in the same manner as the known main instrument manipulators in known teleoperated minimally invasive surgical systems. Also, the position coordinates of main instrument manipulators 620L, 620R and of main tool grips 621L, 621R are known from the kinematics used in controlling the slave surgical instruments. [000122] In the normal viewing mode of operation, viewer 610 displays three-dimensional images of surgical field 103 from stereoscopic endoscope 112. Viewer 610 is positioned on console 185B (figure 6B) near the surgeon's hands, so that the image of the surgical field seen in viewer 610 is oriented so that surgeon 180B feels that he is actually looking directly down at surgical field 103. The surgical instruments in the image appear to be located substantially where the surgeon's hands are located, and oriented substantially as surgeon 180B would expect based on the position of his hands. However, surgeon 180B can see neither his hands nor the position or orientation of main tool grips 621L, 621R while viewing the displayed image of the surgical field in viewer 610. [000123] In one aspect, main instrument manipulators 620L, 620R are moved from directly in front of surgeon 180B and out from under viewer 610, so that they are positioned over base 630 and are no longer positioned under viewer 610; that is, the main instrument manipulators are moved out of the way for hand gesturing. This provides an unobstructed volume under viewer 610 in which surgeon 180B can make hand gestures, either or both of hand gesture poses and hand gesture trajectories. [000124] In the aspect of figure 6A, three coordinate systems are defined with respect to surgeon's console 185B: an eye coordinate system 660, a world coordinate system 670, and a tracking coordinate system 650. Note that equivalent coordinate systems are defined for surgeon 181 (figure 1), so that the mapping described more fully below can be performed for tracking data from main finger tracking grip 170 or from main tool grips 621L, 621R. See, for example, US patent application No. 12/617,937, entitled "Patient-Side Surgeon Interface For a Minimally Invasive Teleoperated Surgical Instrument," filed on November 13, 2009, which is incorporated herein by reference in its entirety. [000125] In eye coordinate system 660, surgeon 180B is looking down the Z-axis Z_view. The Y-axis Y_view points up in the display. The X-axis X_view points to the left in the display. In world coordinate system 670, the Z-axis Z_world is a vertical axis. The world X-axis X_world and the world Y-axis Y_world lie in a plane perpendicular to the Z-axis Z_world. [000126] Figure 6B is a more detailed illustration of main tool grip 621 and main instrument manipulator 620.
Coordinate systems 680, 690 are discussed more fully below with respect to method 1100 of figure 11. Process for controlling a surgical instrument through hand tracking [000127] Figure 7 is an illustration of sensor 212 mounted on index finger 292B at location 713 in tracking coordinate system 750, and of sensor 211 mounted on thumb 292A at location 711 in tracking coordinate system 750. Sensors 211 and 212 are part of the electromagnetic tracking system described above. Thumb 292A and index finger 292B are examples of the fingers of right hand 291R. As noted earlier, the part of the human hand includes at least one finger of the hand. As is known to those skilled in the art, the fingers, sometimes called digits or phalanges, of the hand are the thumb (first digit), the index finger (second digit), the middle finger (third digit), the ring finger (fourth digit), and the little finger (fifth digit). [000128] Here, the thumb and the index finger are used as examples of two fingers of the human hand. This is only illustrative and is not intended to be limiting. For example, the thumb and the middle finger can be used in place of the thumb and the index finger; the description here is directly applicable to the use of the middle finger as well. Likewise, the use of the right hand is only illustrative; when similar sensors are used on the left hand, the description here is directly applicable to the left hand as well. [000129] Cables 741, 742 connect sensors 211, 212 of main finger tracking grip 270 to hand tracking controller 130. In one aspect, cables 741, 742 carry the position and orientation information from sensors 211, 212 to hand tracking controller 130. [000130] The use of cables to transmit the sensed position and orientation data to hand tracking controller 130 is only illustrative and is not intended to be limiting to this specific aspect. In view of the present description, one skilled in the art can select a mechanism for transmitting the sensed position and orientation data from the main finger tracking grip or grips to hand tracking controller 130 (for example, by using a wireless connection). [000131] Cables 741, 742 do not inhibit movement of main finger tracking grip 270. Since main finger tracking grip 270 is mechanically ungrounded, each main finger tracking grip is effectively unrestricted in both position and orientation movement within the workspace reachable by the surgeon and the workspace of the hand tracking transmitter (for example, left-right, up-down, in-out, roll, pitch, and yaw in a Cartesian coordinate system). [000132] In one aspect, as described above, each sensor 211, 212 in main finger tracking grip 270 senses three degrees of translation and three degrees of rotation, that is, six degrees of freedom. Thus, the sensed data from the two sensors represent twelve degrees of freedom. In another aspect, each sensor 211, 212 in main finger tracking grip 270 senses three degrees of translation and two degrees of rotation (yaw and pitch), that is, five degrees of freedom. Thus, the sensed data from the two sensors represent ten degrees of freedom. [000133] Using a control point position and a control point orientation based on the tracked locations to control a teleoperated slave surgical instrument requires six degrees of freedom (three translations and three rotations), as described more fully below.
Thus, in aspects where each sensor has five or six degrees of freedom, sensors 211, 212 provide redundant degrees of freedom. As described above and more fully below, the redundant degrees of freedom are mapped to parameters used to control aspects of the teleoperated slave surgical instrument other than position and orientation. [000134] In yet another aspect, each sensor 211, 212 senses only three degrees of freedom, so that together the two sensors sense six degrees of freedom. This is sufficient to control three degrees of translation, roll, and grip of a slave surgical instrument that does not include a wrist mechanism. The tracked locations are used to generate the control point location using the six degrees of freedom. The control point orientation is taken as the orientation of the slave surgical instrument. The grip closure parameter is determined, as described below, using the control point location and the control point orientation. Roll is determined, as described above, using the relative movement of the thumb and index finger. [000135] In any of the aspects where the sensors sense six degrees of freedom, or where the sensors sense five degrees of freedom, index finger sensor 212 generates a signal representing a position p_index of the index finger and an orientation R_index of the index finger in tracking coordinate frame 750. Thumb sensor 211 generates a signal representing a position p_thumb of the thumb and an orientation R_thumb of the thumb in tracking coordinate frame 750. In one aspect, positions p_index and p_thumb are taken as aligned with the center of the user's fingernail on index finger 292B and the center of the user's thumbnail on thumb 292A, respectively. [000136] In this example, positions p_index and p_thumb are each represented as a three-by-one vector in tracking coordinate frame 750; that is, positions p_index and p_thumb are in tracking coordinates. [000137] Orientations R_index and R_thumb are each represented as a three-by-three rotation matrix in tracking coordinate frame 750, that is, R_index = [x_index y_index z_index] and R_thumb = [x_thumb y_thumb z_thumb], where the columns are the unit axes of the respective finger frame expressed in tracking coordinates. [000138] A control point position p_cp is centered between index finger 292B and thumb 292A. Control point position p_cp is the origin of control point frame 790, but is specified in tracking coordinates. The Z-axis of control point frame 790 extends through control point position p_cp in the pointing direction, as described more fully below. [000139] Also, as explained below, index finger 292B and thumb 292A are mapped to the jaws of a slave surgical instrument, but the two fingers are more dexterous than the instrument jaws. The Y-axis of control point frame 790 corresponds to the pin used to close the instrument jaws. Thus, the Y-axis of control point frame 790 is perpendicular to a vector between index finger 292B and thumb 292A, as described below. [000140] Control point position p_cp is represented as a three-by-one vector in the tracking coordinates of tracking coordinate frame 750. Control point orientation R_cp is represented as a three-by-three matrix in tracking coordinates, that is, R_cp = [x_cp y_cp z_cp].
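In view of the representations just described, the sensed quantities can be pictured as simple data structures. The following is a minimal sketch in Python with numpy; the names FingerSample and HandSample are illustrative assumptions and only show the representation, not the actual system software.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class FingerSample:
    """One electromagnetic sensor reading: a 3x1 position and a 3x3 rotation,
    both expressed in tracking coordinate frame 750."""
    p: np.ndarray  # position, shape (3,)
    R: np.ndarray  # orientation, shape (3, 3); columns are the finger frame axes

@dataclass
class HandSample:
    """A pair of readings for one hand: index finger sensor 212 and thumb sensor 211."""
    index: FingerSample
    thumb: FingerSample

# Example reading: index finger at the origin pointing along +Z of the tracking frame.
sample = HandSample(
    index=FingerSample(p=np.zeros(3), R=np.eye(3)),
    thumb=FingerSample(p=np.array([0.0, 0.05, 0.0]), R=np.eye(3)),
)
```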
[000141] Figure 8 is a process flow diagram for mapping the location of part of a hand to a grip closure parameter used to control the grip of a slave surgical instrument, for example, one of the teleoperated slave surgical instruments in figure 1. This mapping also maps a temporal change in the location to a new grip closure parameter and to the corresponding location of the slave instrument tip and the velocity of moving to that location. [000142] Initially, upon entering process 800, in process RECEIVE HAND LOCATION DATA 810, the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are received, and in this example are stored as data 811. The index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process 810 transfers to process MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER 820. [000143] Process MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER 820 generates a control point position p_cp, a control point orientation R_cp, and a grip closure parameter g_grip using the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb). Control point position p_cp, control point orientation R_cp, and grip closure parameter g_grip are stored as data 821. [000144] In one aspect, the control point mapping performed in process 820 is defined to emulate key properties of the control point placement of the known main instrument manipulators. Thus, the response to movement of the thumb and index finger will be familiar and intuitive to users of known teleoperated minimally invasive surgical systems with a surgeon's console similar to surgeon's console 180B (figure 6A). [000145] Figure 9 is a more detailed process flow diagram for one aspect of process MAP LOCATION DATA TO CONTROL POINT AND GRIP PARAMETER 820. First, in process 820, process MAP HAND POSITION DATA TO CONTROL POINT 910 generates the control point position p_cp from index finger position p_index and thumb position p_thumb, that is, p_cp = 0.5 * (p_index + p_thumb) [000146] Control point position p_cp is the average of the index finger position and the thumb position. Process MAP HAND POSITION DATA TO CONTROL POINT 910 transfers processing to process GENERATE CONTROL POINT ORIENTATION 920. [000147] As indicated above, the Z-axis of the control point orientation is aligned with the pointing direction. In this aspect of process GENERATE CONTROL POINT ORIENTATION 920, the Rodrigues axis/angle formula is used to define the Z-axis pointing direction vector ẑ_half of the control point orientation as the half rotation between the index finger pointing direction vector ẑ_index and the thumb pointing direction vector ẑ_thumb. From thumb orientation R_thumb, the thumb pointing direction vector ẑ_thumb is the third column of R_thumb. [000148] Similarly, from index finger orientation R_index, the index finger pointing direction vector ẑ_index is the third column of R_index. [000149] Vector ω is a vector perpendicular to index finger pointing direction vector ẑ_index and thumb pointing direction vector ẑ_thumb. Vector ω is defined as the cross product of the index finger pointing direction vector and the thumb pointing direction vector, that is, ω = ẑ_index × ẑ_thumb [000150] Angle θ is the magnitude of the angle between index finger pointing direction vector ẑ_index and thumb pointing direction vector ẑ_thumb.
The angle θ is defined as θ = atan2(‖ω‖, ẑ_index · ẑ_thumb). [000151] With axis ω and angle θ, the Z-axis pointing direction vector ẑ_half is obtained by rotating ẑ_index about unit axis ω/‖ω‖ by half the angle, θ/2, using the Rodrigues rotation formula. [000152] Thus, process 910 generated control point position p_cp, and the initial part of process 920 generated the approximate Z-axis pointing direction in control point frame 790. One could proceed to interpolate the index finger and thumb orientation vectors to generate the control point unit vector axes x_cp and y_cp in a similar way, and then re-orthogonalize them to produce a control point orientation matrix. [000153] However, greater teleoperation dexterity can be achieved from the tracked finger locations by using the following mapping. This mapping uses the relative positions of the index finger and thumb to effectively roll and yaw the control point as if a small gimbal between the fingers were being handled. The rest of process 920 is performed as follows to generate a complete set of orthogonal control point unit vector axes x_cp, y_cp, and z_cp. [000154] With these vectors, control point orientation R_cp is: R_cp = [x_cp y_cp z_cp] [000155] Now, with processes 910 and 920, process 820 has mapped the index finger and thumb positions and orientations (p_index, R_index), (p_thumb, R_thumb) to the control point position and orientation (p_cp, R_cp). Process 820 must also generate the grip closure parameter. Thus, process GENERATE CONTROL POINT ORIENTATION 920 transfers processing to process GENERATE GRIP CLOSURE PARAMETER 930.
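In view of processes 910 and 920 just described, the control point pose computation can be sketched as follows in Python with numpy. This is a minimal sketch: it assumes the pointing direction is the third column of each finger rotation matrix, and the completion of the x_cp and y_cp axes shown here (keeping the Y-axis perpendicular to the thumb-to-index vector, per the jaw-pin description of [000139]) is one plausible choice, not necessarily the system's exact formulation.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix for a rotation of `angle` radians about unit vector `axis`
    (the Rodrigues axis/angle formula)."""
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def control_point_pose(p_index, R_index, p_thumb, R_thumb):
    # Process 910: control point position is the average of the finger positions.
    p_cp = 0.5 * (p_index + p_thumb)

    # Process 920: pointing directions are taken as the Z-axes of the finger frames.
    z_index = R_index[:, 2]
    z_thumb = R_thumb[:, 2]

    # Half rotation from z_index toward z_thumb about their common perpendicular.
    omega = np.cross(z_index, z_thumb)
    theta = np.arctan2(np.linalg.norm(omega), z_index @ z_thumb)
    if np.linalg.norm(omega) > 1e-9:
        z_cp = rodrigues(omega / np.linalg.norm(omega), 0.5 * theta) @ z_index
    else:
        z_cp = z_index  # fingers parallel: no rotation needed

    # Complete an orthonormal right-handed frame; assumes the thumb-to-index
    # vector is not aligned with the pointing direction.
    v = p_index - p_thumb
    y_cp = np.cross(z_cp, v)
    y_cp /= np.linalg.norm(y_cp)
    x_cp = np.cross(y_cp, z_cp)
    return p_cp, np.column_stack([x_cp, y_cp, z_cp])
```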
[000156] In process 930, grip closure is determined by the distances of the index finger position and the thumb position from the centerline axis defined by control point position p_cp and Z-axis direction ẑ_cp. This allows grip closure to be invariant to sliding when the thumb and index finger are touching. [000157] Thus, the index finger position and the thumb position are projected onto the Z-axis of frame 790. Position p_index_proj is the projection of index finger position p_index onto the Z-axis of frame 790, and position p_thumb_proj is the projection of thumb position p_thumb onto the Z-axis of frame 790. [000158] Position p_index_proj and position p_thumb_proj are used to evaluate a raw grip closure distance d_val, that is, d_val = ‖p_index − p_index_proj‖ + ‖p_thumb − p_thumb_proj‖ [000159] Here, the double parallel bars denote the well-known Euclidean two-norm distance. Raw grip closure distance d_val is bounded by a maximum distance threshold d_max and a minimum distance threshold d_min. As illustrated in figure 7, foam-padded connector 210 between sensors 211, 212 restricts the fingers to a separation between maximum distance threshold d_max and minimum distance threshold d_min. Additionally, a neutral distance d_0 corresponds to the separation distance when the two fingers are just touching. [000160] For a particular set of sensors and a connector, maximum distance threshold d_max, minimum distance threshold d_min, and neutral distance d_0 are determined empirically. In one aspect, three different combinations of sensors and connector are provided for small, medium, and large hands. Each combination has its own maximum distance threshold d_max, minimum distance threshold d_min, and neutral distance d_0, since the length of connector 210 is different in each combination. [000161] Process 930 compares distance d_val to minimum distance threshold d_min. If the comparison finds that distance d_val is less than minimum distance threshold d_min, grip closure distance d is set to minimum distance threshold d_min. Otherwise, process 930 compares distance d_val to maximum distance threshold d_max. If the comparison finds that distance d_val is greater than maximum distance threshold d_max, grip closure distance d is set to maximum distance threshold d_max. Otherwise, grip closure distance d is set to distance d_val. [000162] The test performed on distance d_val to determine grip closure distance d is summarized as: d = d_min if d_val < d_min; d = d_max if d_val > d_max; d = d_val otherwise. [000163] Then, in process 930, the grip closure parameter g_grip is generated, for example, linearly: g_grip = (d − d_0) / (d_max − d_0) if d ≥ d_0, and g_grip = (d − d_0) / (d_0 − d_min) if d < d_0. [000164] Thus, a grip closure distance d between maximum distance threshold d_max and neutral distance d_0 is mapped to a value between zero and one, and a grip closure distance d between minimum distance threshold d_min and neutral distance d_0 is mapped to a value between minus one and zero. [000165] A value of one for grip closure parameter g_grip is obtained when index finger 292B and thumb 292A are separated to the maximum extent allowed by connector 210 (figure 2A). A value of zero for grip closure parameter g_grip is obtained when the tip of index finger 292B and the tip of thumb 292A are just touching (figure 2C). Values in the range between zero and one control the opening/closing of the jaws of the end effector of a slave surgical instrument. A value of minus one for grip closure parameter g_grip is obtained when index finger 292B and thumb 292A are touching and connector 210 is completely compressed between index finger 292B and thumb 292A (figure 2D). Values in the range between zero and minus one control the jaw force of the closed jaws of the end effector. Connector 210 provides passive haptic feedback for jaw closure. [000166] This example of mapping grip closure distance d to a value in one of two ranges is only illustrative and is not intended to be limiting. The example illustrates mapping grip closure distance d to a value in a first range of grip closure parameter g_grip to control the opening/closing of the jaws of an end effector of a slave surgical instrument when grip closure distance d is greater than neutral distance d_0. Here, "opening/closing" means opening and closing the jaws. Grip closure distance d is mapped to a value in a second range of grip closure parameter g_grip to control the jaw force of the closed jaws of the end effector when grip closure distance d is less than neutral distance d_0. [000167] Thus, process 820 has mapped the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) to a control point position and orientation (p_cp, R_cp) and a grip closure parameter g_grip, which are stored as data 821. Process 820 transfers to process MAP TO WORLD COORDINATES 830 (figure 8).
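In view of process 930 just described, the grip closure computation can be sketched as follows. This is a minimal sketch, assuming the piecewise-linear mapping stated above (which is itself one possible form consistent with the stated endpoints); d_min, d_0, and d_max are the empirically determined per-connector calibration values.

```python
import numpy as np

def grip_closure(p_index, p_thumb, p_cp, z_cp, d_min, d_0, d_max):
    """Grip closure parameter g_grip in [-1, 1] from the finger distances to
    the centerline axis through p_cp along unit vector z_cp (process 930)."""
    def dist_to_axis(p):
        w = p - p_cp
        return np.linalg.norm(w - (w @ z_cp) * z_cp)

    d_val = dist_to_axis(p_index) + dist_to_axis(p_thumb)
    d = np.clip(d_val, d_min, d_max)          # clamping of [000161]-[000162]
    if d >= d_0:
        return (d - d_0) / (d_max - d_0)      # jaw opening/closing: 0 .. 1
    return (d - d_0) / (d_0 - d_min)          # jaw force on closed jaws: -1 .. 0
```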
[000168] Process MAP TO WORLD COORDINATES 830 receives data 821 and maps data 821 to the world coordinate system (see world coordinate system 670, figure 6A). Specifically, the control point position and orientation (p_cp, R_cp) and grip closure parameter g_grip are mapped to a control point position and orientation in world coordinates (p_cp_wc, R_cp_wc) using a four-by-four homogeneous transform wcT_tc that maps coordinates in tracking coordinate system 750 (figure 7) to coordinates in world coordinate system 670, where rotation wcR_tc maps an orientation in tracking coordinates tc to an orientation in world coordinates wc, and translation wct_tc maps a position in tracking coordinates tc to a position in world coordinates wc. [000169] The grip closure parameter is not changed by this mapping. The data in world coordinates wc are stored as data 831. Process 830 transfers to process MAP TO EYE COORDINATES 840. [000170] Process MAP TO EYE COORDINATES 840 receives data 831 in world coordinates wc and maps data 831 to the eye coordinate system (see eye coordinate system 660, figure 6A). Specifically, the control point position and orientation in world coordinates (p_cp_wc, R_cp_wc) and grip closure parameter g_grip are mapped to a control point position and orientation in eye coordinates (p_cp_ec, R_cp_ec) using a four-by-four homogeneous transform ecT_wc that maps coordinates in world coordinate system 670 (figure 6A) to coordinates in eye coordinate system 660, where rotation ecR_wc maps an orientation in world coordinates wc to an orientation in eye coordinates ec, and translation ect_wc maps a position in world coordinates wc to a position in eye coordinates ec. [000171] Once again, grip closure parameter g_grip is not changed by the mapping. The data in eye coordinates are stored as data 841. Process 840 transfers to process GENERATE VELOCITIES 850. [000172] In process 800, mapping processes 830 and 840 are described as two different processes only for ease of illustration. In one aspect, mapping processes 830 and 840 are combined, so that the control point data in tracking coordinates tc are mapped directly to data in eye coordinates ec using a four-by-four homogeneous transform ecT_tc that maps coordinates in tracking coordinate system 650 (figure 6A) to coordinates in eye coordinate system 660, that is, ecT_tc = ecT_wc * wcT_tc [000173] In this aspect, the control point position in eye coordinates is p_cp_ec = ecR_tc * p_cp + ect_tc, and the control point orientation in eye coordinates is R_cp_ec = ecR_tc * R_cp. [000174] In some aspects, the world coordinate mapping can be eliminated. In that case, the control point data are mapped directly from the tracking coordinate system into the eye coordinate system without using a world coordinate system. [000175] For teleoperation, position, orientation, and velocity are necessary. Thus, process GENERATE VELOCITIES 850 generates the necessary velocities. The velocities can be generated in several ways. Some implementations, such as inertial and gyroscope sensors, can directly measure differential signals to produce a linear velocity and an angular velocity of the control point. If the velocities cannot be measured directly, process 850, in one aspect, estimates the velocities from the location measurements in the eye coordinate system. [000176] The velocities can be estimated using finite differences in the eye coordinate system over the sampling interval. For example, the linear velocity v_cp_ec is estimated as v_cp_ec = (p_cp_ec(t) − p_cp_ec(t − Δt)) / Δt, and the angular velocity ω_cp_ec is estimated from the incremental rotation R_cp_ec(t) * R_cp_ec(t − Δt)^T over the sampling interval Δt.
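In view of processes 830 to 850 just described, the coordinate mapping and finite-difference velocity estimation can be sketched as follows. This is a minimal sketch under stated assumptions: T_ec_tc is the combined four-by-four transform of [000172], and the axis/angle extraction of the angular velocity from the incremental rotation is one common choice, not necessarily the system's exact filter.

```python
import numpy as np

def apply_T(T, p):
    """Apply a 4x4 homogeneous transform to a 3-vector position."""
    return T[:3, :3] @ p + T[:3, 3]

def estimate_velocities(T_ec_tc, p_prev, R_prev, p_curr, R_curr, dt):
    """Map two control point samples from tracking to eye coordinates and
    estimate linear and angular velocity by finite differences over dt."""
    R_ec_tc = T_ec_tc[:3, :3]
    p0, p1 = apply_T(T_ec_tc, p_prev), apply_T(T_ec_tc, p_curr)
    R0, R1 = R_ec_tc @ R_prev, R_ec_tc @ R_curr

    v = (p1 - p0) / dt                        # linear velocity in eye coordinates

    dR = R1 @ R0.T                            # incremental rotation over dt
    angle = np.arccos(np.clip((np.trace(dR) - 1) / 2, -1.0, 1.0))
    if angle < 1e-9:
        w = np.zeros(3)
    else:
        axis = np.array([dR[2, 1] - dR[1, 2],
                         dR[0, 2] - dR[2, 0],
                         dR[1, 0] - dR[0, 1]]) / (2 * np.sin(angle))
        w = axis * angle / dt                 # angular velocity in eye coordinates
    return v, w
```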
[000177] In another aspect of process GENERATE VELOCITIES 850, the control point linear velocity v_cp_tc and the control point angular velocity ω_cp_tc are sensed directly in the tracking coordinates of tracking coordinate system 750 (figure 7). In this aspect, the directly sensed control point linear velocity v_cp_tc and the directly sensed control point angular velocity ω_cp_tc are rotated from tracking coordinate system 750 to eye coordinate system 660 using rotation ecR_tc. Specifically, using the rotation mapping defined above, v_cp_ec = ecR_tc * v_cp_tc and ω_cp_ec = ecR_tc * ω_cp_tc. [000178] Process GENERATE VELOCITIES 850 transfers to process SEND CONTROL COMMAND 860. Process 860 sends an appropriate system control command to the slave surgical instrument based on the position, orientation, velocities, and grip closure parameter stored as data 851. [000179] In one aspect, processes 810 to 850 are performed by hand tracking controller 130 (figure 1). Controller 130 executes finger tracking module 135 on processor 131 to perform processes 810 to 850. In this aspect, finger tracking module 135 is stored in memory 132. Process 850 sends a system event to system controller 140, which in turn performs process 860. [000180] It should be noted that hand tracking controller 130 and system controller 140 can be implemented in practice by any combination of hardware, software executing on a processor, and firmware. Also, the functions of these controllers, as described here, can be performed by one unit or divided among different components, each of which can in turn be implemented by any combination of hardware, software executing on a processor, and firmware. When divided among different components, the components can be centralized in one location or distributed across system 100 for distributed processing purposes. Hand gesture pose and hand gesture trajectory control processes [000181] Figure 10 is a process flow diagram of one aspect of a process 1000 for hand gesture pose and hand gesture trajectory control of system 100. In one aspect, as described above, a hand gesture pose recognition process 1050 uses a multidimensional Bayesian classifier, and a hand gesture trajectory recognition process 1060 uses a discrete Hidden Markov Model. [000182] As described above, figures 3A to 3D are examples of hand gesture poses. To train hand gesture pose recognition process 1050, a set of hand gesture poses is specified. The number of hand gesture poses used is limited by the ability to define unique poses that can be unambiguously identified by recognition process 1050, and by the surgeon's ability to remember and reliably reproduce each of the different poses. [000183] In addition to defining the hand gesture poses, a feature set including a plurality of features f_i, where i ranges from 1 to n, is defined. The number n is the number of features used. The number and type of features are selected so that each hand gesture pose in the set of permissible poses can be accurately identified. In one aspect, the number n is six. [000184] The following is an example of a feature set with n features. [000185] Feature f_1 is the dot product of the pointing direction of index finger 292B and the pointing direction of thumb 292A.
Feature f_2 is the distance between index finger 292B and thumb 292A. Feature f_3 is the distance of thumb 292A projected along the pointing direction of index finger 292B. Feature f_4 is the distance of thumb 292A from the axis along the pointing direction of index finger 292B. Feature f_5 is the Z-component of the pointing direction of thumb 292A. Feature f_6 is the dot product of thumb normal vector x̂_thumb of thumb 292A and pointing direction ẑ_index of index finger 292B. [000186] Before method 1000 can be used, it is necessary to build a database for training the hand gesture poses. A number of different users produce each hand gesture pose at least once, and the position and orientation data for each hand gesture pose for each user are measured using the tracking system. For example, each person in a group of people makes each of the permissible hand gesture poses. The index finger and thumb positions and orientations (p_index, R_index), (p_thumb, R_thumb) are saved for each hand gesture pose for each person in the group in a training database. [000187] Using the training database, a feature set {f_i} is generated for each hand gesture pose for each user. The set of training feature vectors for each hand gesture pose is then used to compute a mean f̄ and a covariance Σ. [000188] Thus, the training database is used to obtain a feature vector mean and covariance for each trained gesture. Additionally, for each hand gesture pose, the Mahalanobis distance d(f_i) (see discussion below) is generated for each trainer, and the maximum Mahalanobis distance d(f_i) for each hand gesture pose is saved as a threshold for that hand gesture pose. [000189] One can also use the measured Mahalanobis distance to verify that all trained gestures are sufficiently different and unambiguous for the particular feature set used. This can be accomplished by testing the Mahalanobis distance between the feature vector mean f̄ of a particular gesture and the feature vector means of all other permissible gesture poses. Each such test distance should be much greater than the maximum training distance threshold used for that particular gesture.
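In view of the training procedure of [000187]-[000188] and the selection rule described later in [000207], the pose classifier can be sketched as follows. This is a minimal sketch: the class name PoseClassifier is illustrative, and the covariance is inverted directly, which is mathematically equivalent to the diagonalized eigendecomposition form discussed below.

```python
import numpy as np

class PoseClassifier:
    """Nearest-pose classifier using the Mahalanobis distance, trained per pose
    from feature vectors; the maximum training distance serves as the threshold."""
    def __init__(self):
        self.poses = {}  # name -> (mean, inverse covariance, max training distance)

    def train(self, name, feature_vectors):
        F = np.asarray(feature_vectors)              # shape (num_samples, n)
        mean = F.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(F, rowvar=False))
        d = [np.sqrt((f - mean) @ cov_inv @ (f - mean)) for f in F]
        self.poses[name] = (mean, cov_inv, max(d))   # max distance as threshold

    def classify(self, f_o):
        """Return the pose with the smallest Mahalanobis distance, or None if
        the best distance exceeds that pose's training threshold."""
        best, best_d = None, np.inf
        for name, (mean, cov_inv, d_max) in self.poses.items():
            diff = f_o - mean
            d = np.sqrt(diff @ cov_inv @ diff)
            if d < best_d:
                best, best_d = name, d
        if best is not None and best_d <= self.poses[best][2]:
            return best
        return None
```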
[000190] As is known to those skilled in the art, the specification of a Hidden Markov Model requires the specification of two model parameters, N and M, and three probability measures A, B, and π. A Hidden Markov Model λ is represented as: λ = (A, B, π) [000191] Model parameter N is the number of states in the model, and model parameter M is the number of observation symbols per state. The three probability measures are the state transition probability distribution A, the observation symbol probability distribution B, and the initial state distribution π. [000192] In one aspect, for a discrete Hidden Markov Model, transition probability distribution A is an N x N matrix, observation probability distribution B is an N x M matrix, and initial state distribution π is an N x 1 matrix. [000193] Given an observation sequence O and Hidden Markov Model λ, the probability of observation sequence O given Hidden Markov Model λ, that is, P(O | λ), is evaluated in process 1000, as described more fully below. [000194] To generate the probability distributions for the Hidden Markov Model, a training database is required. Before obtaining the training database, a set of hand gesture trajectories is specified. [000195] A number j of test subjects are selected to produce each of the hand gesture trajectories. While in figure 4C the sixteen hand gesture trajectories are presented in a projected two-dimensional form, the test subjects were unconstrained when performing the various hand gesture trajectories, which allows some three-dimensional variation to arise. In one aspect, each subject performed each hand gesture trajectory k times, producing j * k training sequences per hand gesture trajectory. [000196] In one aspect, a discrete left-right Hidden Markov Model was used. Hidden Markov Model λ was chosen so that probability P(O | λ) is locally maximized using an iterative Baum-Welch method. See, for example, Lawrence R. Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286 (Feb. 1989), which is incorporated herein by reference as a demonstration that Hidden Markov Models are known to those skilled in the art. In one aspect, the iterative method was stopped when the model converged to within 0.1 percent over three successive iterations. [000197] The initial state probability π was set so that the model always starts in state one. Transition probability matrix A was initialized with random entries, which were sorted in descending order on a row-by-row basis. To enforce the left-to-right structure, all entries below the diagonal of transition probability matrix A were set to zero. In addition, transitions of more than two states were disallowed by setting to zero the entries where (j − i) > 2 for all rows i and columns j. Transition probability matrix A was then normalized on a row-by-row basis. [000198] Initialization of observation probability matrix B divided the observation sequence evenly based on the number of desired states. Therefore, each state can initially observe one or more symbols with a probability based on a local frequency count. This matrix was also normalized on a row-by-row basis. See, for example, N. Liu, R.I.A. Davis, B.C. Lovell, P.J. Kootsookos, "Effect of Initial HMM Choices in Multiple Sequence Training for Gesture Recognition," International Conference on Information Technology, 5-7, Las Vegas, pages 608-613 (April 2004), which is incorporated herein by reference as a demonstration that initialization procedures for Hidden Markov Models are known to those skilled in the art. A Hidden Markov Model was developed for each of the hand gesture trajectories.
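In view of the initialization just described in [000197]-[000198], a left-right HMM initialization can be sketched as follows. This is a minimal sketch under stated assumptions: the observation matrix is started near-uniform here, whereas the text describes frequency counts from the divided training sequence; the function name is illustrative.

```python
import numpy as np

def init_left_right_hmm(N, M, rng=None):
    """Initialize (A, B, pi) for a discrete left-right HMM: start in state one,
    no backward transitions, and forward jumps of at most two states."""
    rng = rng or np.random.default_rng(0)
    A = rng.random((N, N))
    A = -np.sort(-A, axis=1)            # random entries, sorted descending per row
    for i in range(N):
        for j in range(N):
            if j < i or (j - i) > 2:    # left-right structure, max jump of 2
                A[i, j] = 0.0
    A /= A.sum(axis=1, keepdims=True)   # row-normalize

    # Near-uniform observation matrix; per-state frequency counts from the
    # evenly divided training sequences would replace this in practice.
    B = np.full((N, M), 1.0 / M)

    pi = np.zeros(N)
    pi[0] = 1.0                          # model always starts in state one
    return A, B, pi
```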
[000199] Returning to method 1000, check process GESTURE MODE ENABLED 1001 determines whether the surgeon has enabled the gesture recognition mode of operation of system 100. In one aspect, to enable gesture recognition mode, the surgeon depresses a foot pedal on surgeon's console 185 (figure 1A). If gesture recognition mode is enabled, check process 1001 transfers to process RECEIVE HAND LOCATION DATA 1010, and otherwise returns through RETURN 1002. [000200] Process RECEIVE HAND LOCATION DATA 1010 receives the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) for the gesture being made by the surgeon. As noted above, the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process 1010 transfers to process GENERATE FEATURES 1011. [000201] In process GENERATE FEATURES 1011, the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are used to generate each of the features f_1_o to f_n_o in an observed feature vector f_o. Process GENERATE FEATURES 1011 transfers to process COMPARE FEATURES WITH KNOWN POSES 1012. [000202] Process COMPARE FEATURES WITH KNOWN POSES 1012 compares observed feature vector f_o with the trained feature set {f_i} for each pose. This process determines the probability that the observed feature vector belongs to the training feature set {f_i} for a particular hand gesture pose, that is, that it corresponds to the training data set. This can be expressed as P(f_o | Ω), where the training feature set {f_i} is from object class Ω. [000203] In this example, probability P(f_o | Ω) is taken as the multivariate Gaussian density in the feature space, where N is the dimensionality of the feature vector, for example, n in the example above. [000204] A statistic used to characterize this probability is the Mahalanobis distance d(f_o), defined as d(f_o)² = f̃_o^T Σ⁻¹ f̃_o, where f̃_o = f_o − f̄. The Mahalanobis distance is known to those skilled in the art. See, for example, Moghaddam, Baback and Pentland, Alex, "Probabilistic Visual Learning for Object Representation," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, pp. 696-710 (July 1997), which is incorporated herein by reference. [000205] Using the eigenvectors Φ and eigenvalues λ_i of covariance Σ, the covariance is used in diagonalized form with ỹ = Φ^T f̃_o. The diagonalized form allows the Mahalanobis distance d(f_o) to be expressed in terms of the sum: d(f_o)² = Σ_{i=1..N} ỹ_i² / λ_i [000206] In this example, this is the expression evaluated to determine the Mahalanobis distance d(f_o). Thus, process 1012 generates the Mahalanobis distance d(f_o). Upon completion, process 1012 transfers to process SELECT POSE 1013. [000207] In process SELECT POSE 1013, the hand gesture pose having the shortest Mahalanobis distance d(f_o) is selected if that Mahalanobis distance d(f_o) is less than the maximum Mahalanobis distance in the training database for that hand gesture pose. If the Mahalanobis distance d(f_o) is greater than the maximum Mahalanobis distance in the training database for that hand gesture pose, no hand gesture pose is selected. Process SELECT POSE 1013 transfers to process TEMPORAL FILTER 1014. [000208] Process TEMPORAL FILTER 1014 determines whether process 1013 has provided the same result consecutively a predetermined number of times. If process 1013 has provided the same result the predetermined number of times, process TEMPORAL FILTER 1014 transfers to check process GESTURE POSE 1015, and otherwise returns. The predetermined number of times is selected so that process TEMPORAL FILTER 1014 avoids transient oscillations or spurious detections when switching between hand gesture poses.
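In view of process TEMPORAL FILTER 1014 just described, the debouncing logic can be sketched as follows. This is a minimal sketch; the class name and the count of three are illustrative assumptions, since the predetermined number of times is a tuning parameter.

```python
class TemporalFilter:
    """Debounce pose detections: report a pose only after it has been observed
    the same way `required` consecutive times (process TEMPORAL FILTER 1014)."""
    def __init__(self, required=3):
        self.required = required  # tunable; 3 is only an example value
        self.last = None
        self.count = 0

    def update(self, pose):
        if pose == self.last:
            self.count += 1
        else:
            self.last, self.count = pose, 1
        return pose if self.count >= self.required else None
```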
[000209] Check process GESTURE POSE 1015 determines whether the selected hand gesture pose is the hand gesture pose used for hand gesture trajectories. If the selected hand gesture pose is the trajectory hand gesture pose, check process GESTURE POSE 1015 transfers processing to process GENERATE VELOCITY SEQUENCE 1020, and otherwise transfers processing to check process POSE CHANGE 1016. [000210] Check process POSE CHANGE 1016 determines whether the hand gesture pose has changed since the last pass through method 1000. If the selected hand gesture pose is the same as the immediately preceding temporally filtered gesture pose result, check process POSE CHANGE 1016 returns through RETURN 1003, and otherwise transfers to process MAP TO SYSTEM EVENT 1030. [000211] Process MAP TO SYSTEM EVENT 1030 maps the selected hand gesture pose to a system event; for example, the system event assigned to that hand gesture pose is looked up. Upon finding the system event, process MAP TO SYSTEM EVENT 1030 transfers processing to process INJECT SYSTEM EVENT 1031. [000212] In one aspect, process INJECT SYSTEM EVENT 1031 sends the system event to an event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the controllers and/or other devices in system 100. For example, if the hand gesture pose is assigned to a turn-on-user-interface event, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes the part of user interface module 155 on its processor required to turn on the user interface. [000213] When the hand gesture pose is the hand gesture pose used for producing trajectories, processing in method 1000 transfers from check process GESTURE POSE 1015 to process GENERATE VELOCITY SEQUENCE 1020. In one aspect, the main feature used to recognize a hand gesture trajectory is a unit velocity vector. The unit velocity vector is invariant to the starting position of the gesture. Additionally, the normalized velocity vector accounts for variations in gesture size and speed. Thus, in process 1020, the control point samples are converted into a normalized control point velocity sequence, that is, into a sequence of unit velocity vectors. [000214] Upon completion, process GENERATE VELOCITY SEQUENCE 1020 transfers processing to process CONVERT VELOCITY SEQUENCE TO SYMBOL SEQUENCE 1021. As noted above, a discrete Hidden Markov Model λ requires a sequence of discrete symbols as input. In process 1021, the discrete symbols are generated from the normalized control point velocity sequence through vector quantization. [000215] In one aspect, the vector quantization was performed using a modified K-means clustering, with the condition that the process stops when the cluster assignments stop changing. While K-means clustering is used, the process exploits the fact that the features are unit vectors: vectors that are similar in direction are clustered together. This is accomplished by using the dot product between each unit feature vector and the normalized cluster-center vectors as the similarity metric, as shown in the sketch below.
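In view of the vector quantization of [000214]-[000215], and anticipating the initialization details of the next paragraph, the clustering can be sketched as follows. This is a minimal sketch: it performs a single run, whereas the text selects the best of multiple runs, and the cluster count of 32 comes from the next paragraph.

```python
import numpy as np

def kmeans_unit_vectors(V, k=32, iters=100, rng=None):
    """Cluster unit vectors by direction using dot-product similarity; returns
    the normalized cluster centers and one symbol (cluster index) per vector."""
    rng = rng or np.random.default_rng(0)
    V = np.asarray(V)                          # shape (num_vectors, dim), unit rows
    labels = rng.integers(0, k, size=len(V))   # random initial assignments
    for _ in range(iters):
        # Normalized cluster-center vectors (an empty cluster gets a random one).
        C = np.zeros((k, V.shape[1]))
        for j in range(k):
            members = V[labels == j]
            C[j] = members.sum(axis=0) if len(members) else rng.standard_normal(V.shape[1])
            C[j] /= np.linalg.norm(C[j])
        # Reassign each vector to the most similar center (largest dot product).
        new_labels = np.argmax(V @ C.T, axis=1)
        if np.array_equal(new_labels, labels):
            break                              # assignments stopped changing
        labels = new_labels
    return C, labels                           # labels serve as HMM symbols
```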
[000216] The clustering is initialized with random assignments of the vectors to thirty-two clusters, and the overall process is iterated multiple times, where the best clustering result is selected based on the maximum total within-cluster cost metric. Note that in this case the within-cluster cost is based on the similarity measure. Each of the resulting clusters is assigned a unique index, which serves as the symbol for the Hidden Markov Model. An input vector is then mapped to the nearest cluster mean, and the corresponding index of that cluster is used as the symbol. In this way, a sequence of unit velocity vectors can be translated into a sequence of indices or symbols. [000217] In one aspect, the clustered vectors were assigned a symbol based on a fixed eight-direction two-dimensional vector quantization codebook. Thus, process 1021 generates a sequence of observed symbols and transfers to process GENERATE GESTURE PROBABILITY 1023. [000218] In one aspect, to determine which gesture corresponds to the observed symbol sequence, process GENERATE GESTURE PROBABILITY 1023 uses the forward recursion algorithm with the Hidden Markov Model to find the probability of the observed symbol sequence given each gesture model. The forward recursion algorithm is described in Rabiner, "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," which was previously incorporated by reference. Upon completion of process GENERATE GESTURE PROBABILITY 1023, processing transfers to process SELECT TRAJECTORY 1024. [000219] In process SELECT TRAJECTORY 1024, the hand gesture trajectory with the highest probability among the permissible Hidden Markov Model trajectory gesture models is selected. This probability must also be greater than a given threshold to be accepted. If the highest probability is not greater than the threshold, no hand gesture trajectory is selected. The threshold should be tuned to maximize recognition accuracy while avoiding false recognitions. [000220] Upon completion, process SELECT TRAJECTORY 1024 transfers processing to check process TRAJECTORY FOUND 1025. If process SELECT TRAJECTORY 1024 selected a hand gesture trajectory, check process TRAJECTORY FOUND 1025 transfers processing to process MAP TO SYSTEM EVENT 1030, and otherwise returns through RETURN 1004. [000221] Process MAP TO SYSTEM EVENT 1030 maps the selected hand gesture trajectory to a system event; for example, the system event assigned to that hand gesture trajectory is looked up. Upon finding the system event, process MAP TO SYSTEM EVENT 1030 transfers processing to process INJECT SYSTEM EVENT 1031. [000222] In one aspect, process INJECT SYSTEM EVENT 1031 sends the system event to the event handler in system controller 140 (figure 1). In response to the system event, system controller 140 sends an appropriate system command to the appropriate controller(s) or device(s). For example, if the system event is assigned to an action in a user interface, system controller 140 sends a command to display controller 150 to take that action in the user interface, for example, changing the display mode of the surgical field.
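In view of processes 1023 and 1024 just described, the trajectory recognition step can be sketched as follows. This is a minimal sketch of the standard forward recursion (per Rabiner) and the thresholded selection; an unscaled recursion is shown, whereas a practical implementation would scale the forward variables to avoid underflow, and the names are illustrative.

```python
import numpy as np

def forward_probability(A, B, pi, obs):
    """P(O | lambda) by the forward recursion: alpha_1 = pi * B[:, o_1],
    alpha_{t+1} = (alpha_t @ A) * B[:, o_{t+1}]; the sum of the final
    forward variables is the sequence probability."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def select_trajectory(models, obs, threshold):
    """Pick the gesture model with the highest P(O | lambda), accepting it
    only if that probability exceeds the tuned threshold."""
    probs = {name: forward_probability(A, B, pi, obs)
             for name, (A, B, pi) in models.items()}
    best = max(probs, key=probs.get)
    return best if probs[best] > threshold else None
```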
[000223] Presence detection process [000224] In yet another aspect, as described above, the tracked position of at least part of the hand of surgeon 180B is used to determine whether the hand is present on main tool grip 621 of the main manipulator. Figure 11 is a process flow diagram of one aspect of a presence detection process 1100 performed, in one aspect, by hand tracking controller 130 in system 100. In one aspect, process 1100 is performed separately for each of the surgeon's hands. [000225] In process GET JOINT ANGLES 1110, the joint angles of main tool manipulator 620 (figure 6B) are measured. Process GET JOINT ANGLES 1110 transfers processing to process GENERATE FORWARD KINEMATICS 1111. [000226] Since the lengths of the various links in main tool manipulator 620 are known and the position of base 629 of main tool manipulator 620 is known, geometric relationships are used to generate the location of main tool grip 621 in main workspace coordinate system 680. Thus, process GENERATE FORWARD KINEMATICS 1111 generates position p_mtm of main tool grip 621 in main workspace coordinate system 680 using the angles from process 1110. Process GENERATE FORWARD KINEMATICS 1111 transfers processing to process MAP TO WORLD COORDINATES 1112. [000227] Process MAP TO WORLD COORDINATES 1112 maps position p_mtm in main workspace coordinate system 680 to position p_mtm_wc in world coordinate system 670 (figure 6A). Specifically, p_mtm_wc = wcT_ws * p_mtm, where wcT_ws is a four-by-four rigid homogeneous transform that maps coordinates in workspace coordinate system 680 to coordinates in world coordinate system 670. [000228] Upon completion, process MAP TO WORLD COORDINATES 1112 transfers processing to process GENERATE HAND TO MAIN TOOL GRIP SEPARATION 1130. [000229] Turning to the hand location side, process RECEIVE HAND LOCATION DATA 1120 receives (retrieves) the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb). The index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) are based on data from the tracking system. Process RECEIVE HAND LOCATION DATA 1120 transfers processing to process GENERATE HAND POSITION 1121. [000230] Process GENERATE HAND POSITION 1121 maps the index finger position and orientation (p_index, R_index) and the thumb position and orientation (p_thumb, R_thumb) to a control point position and orientation in the tracking coordinate system, as described above, and that description is incorporated herein by reference. Position p_hand is the control point position in tracking coordinates. Process GENERATE HAND POSITION 1121 transfers processing to process MAP TO WORLD COORDINATES 1122. [000231] The use of the control point position for presence detection is only illustrative and is not intended to be limiting. In view of the present description, presence detection can be carried out, for example, using the position of the tip of the index finger and the position of the tip of the thumb, or using only one of those positions. The processes described below are equivalent for each of the various positions associated with a part of a human hand.
[000232] Process MAP TO WORLD COORDINATES 1122 maps position p_hand in tracking coordinates to a position p_hand_wc in world coordinate system 670 (figure 6A). Specifically, p_hand_wc = wcT_tc * p_hand, where wcT_tc is the four-by-four rigid homogeneous transform that maps coordinates in tracking coordinate system 650 to coordinates in world coordinate system 670. [000233] Upon completion, process MAP TO WORLD COORDINATES 1122 transfers processing to process GENERATE HAND TO MAIN TOOL GRIP SEPARATION 1130. [000234] Process GENERATE HAND TO MAIN TOOL GRIP SEPARATION 1130 generates a separation distance d_sep between position p_mtm_wc in world coordinate system 670 and position p_hand_wc in world coordinate system 670. In one aspect, the separation distance is: d_sep = ‖p_mtm_wc − p_hand_wc‖ [000235] Upon completion, process GENERATE HAND TO MAIN TOOL GRIP SEPARATION 1130 transfers processing to check process DISTANCE SAFE 1131. [000236] Check process DISTANCE SAFE 1131 compares separation distance d_sep to a safe distance threshold. This threshold must be small enough to be conservative and still allow the surgeon to change grip on, or manipulate, the most distal end of the main tool grip. If separation distance d_sep is less than the safe distance threshold, check process DISTANCE SAFE 1131 transfers to process HAND PRESENT ON 1140. Conversely, if separation distance d_sep is greater than the safe distance threshold, check process 1131 transfers to process HAND PRESENT OFF 1150. [000237] Process HAND PRESENT ON 1140 determines whether system 100 is in teleoperation. If system 100 is in teleoperation, no action is required and teleoperation is allowed to proceed, and so process 1140 transfers to the start of process 1100. If system 100 is not in teleoperation, process HAND PRESENT ON 1140 sends a hand-present event to process INJECT SYSTEM EVENT, which in turn sends the hand-present event to system controller 140. [000238] Process HAND PRESENT OFF 1150 determines whether system 100 is in teleoperation. If system 100 is not in teleoperation, no action is required, and so process 1150 transfers to the start of process 1100. If system 100 is in teleoperation, process HAND PRESENT OFF 1150 sends a hand-not-present event to process INJECT SYSTEM EVENT, which in turn sends the hand-not-present event to system controller 140. [000239] System controller 140 determines whether a hand-present event or a hand-not-present event requires any change to the system operating mode and issues an appropriate command. In one aspect, system controller 140 enables teleoperation in response to the hand-present event, and disables teleoperation in response to the hand-not-present event if a teleoperated minimally invasive surgical instrument is coupled to the main tool grip. As is known to those skilled in the art, a teleoperated minimally invasive surgical instrument can be coupled to and decoupled from a main tool grip. [000240] In other aspects, a hand-present event and a hand-not-present event are used by system controller 140 in combination with other events to determine whether to allow teleoperation. For example, detection of the presence of the surgeon's head can be combined with detection of the presence of the surgeon's hand or hands to determine whether to allow teleoperation.
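In view of processes 1112 through 1150 just described, the core of the presence check can be sketched as follows. This is a minimal sketch under stated assumptions: the transforms are the four-by-four homogeneous matrices defined above, the safe-distance threshold d_safe is an empirically tuned value, and the function and event names are illustrative.

```python
import numpy as np

def hand_presence_event(T_wc_ws, p_mtm, T_wc_tc, p_hand, d_safe, teleoperating):
    """Map the main tool grip and hand positions into the common (world) frame,
    compare their separation to the safe-distance threshold, and report an
    event only when the presence state and the teleoperation state disagree."""
    p_mtm_wc = T_wc_ws[:3, :3] @ p_mtm + T_wc_ws[:3, 3]
    p_hand_wc = T_wc_tc[:3, :3] @ p_hand + T_wc_tc[:3, 3]
    d_sep = np.linalg.norm(p_mtm_wc - p_hand_wc)

    hand_present = d_sep < d_safe
    if hand_present and not teleoperating:
        return "HAND_PRESENT"        # controller may enable teleoperation
    if not hand_present and teleoperating:
        return "HAND_NOT_PRESENT"    # controller may disable teleoperation
    return None                      # no mode change required
```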
[000241] Similarly, as described above, a hand-present event and a hand-not-present event are used by system controller 140 to control the display of a user interface on a display of the minimally invasive surgical system. When system controller 140 receives a hand-not-present event, if the user interface is not turned on, system controller 140 sends a command to display controller 150 to turn on the user interface. Display controller 150 executes the part of user interface module 155 on its processor required to turn on the user interface. When system controller 140 receives a hand-present event, if the user interface is turned on, system controller 140 sends a command to display controller 150 to turn off the user interface. Display controller 150 executes the part of user interface module 155 on its processor required to turn off the user interface. [000242] A hand-present event and a hand-not-present event can also be used by system controller 140 in combination with other events to determine whether to display the user interface. Thus, user interface display control and teleoperation control are examples of system mode control using presence detection, and presence detection is not intended to be limited to those two specific system control modes. [000243] For example, presence detection can be used to control a visual proxy, such as those described more fully below. Also, combinations of the different modes, for example, teleoperation and visual proxy display, can be controlled by system controller 140 based on a hand-present event and a hand-not-present event. [000244] Also, hand presence detection is useful in eliminating the dual assignment of main tool grips 621L, 621R, for example, pressing a foot pedal and then using main tool grips 621L, 621R to control a user interface that is displayed on surgeon's console 185B. When the main tool grips are dual-assigned, for example, used to control not only a surgical instrument but also a user interface, the surgeon typically has to press a foot pedal to switch to the user interface mode of operation. If, for some reason, the surgeon fails to press the pedal but believes that the system has switched to the user interface mode of operation, movement of the main tool grip may result in unwanted movement of the surgical instrument. Presence detection process 1100 is used to avoid this problem and to eliminate the dual assignment of the main tool grips. [000245] With presence detection process 1100, in one example, when the hand-not-present event is received by system controller 140, system controller 140 sends a system command to lock main instrument manipulators 620L, 620R (figure 6A) in place, and sends a system command to display controller 150 to present the user interface on the display of surgeon's console 185B. The movement of the surgeon's hand is tracked and is used to control elements in the user interface, for example, moving a slider switch, changing the display, etc. As noted above, the control point is mapped into the eye coordinate frame and thus can be associated with the location of an element in the user interface. The movement of the control point is used to manipulate that element. This is accomplished without the surgeon having to activate a foot pedal, and is done so that the surgeon cannot inadvertently move a surgical instrument. This eliminates the problems associated with using the main tool grips to control both the surgical instrument and the user interface.
[000246] In the example above, the world coordinate frame is an example of a common coordinate frame. The use of the world coordinate frame as the common coordinate frame is illustrative only and is not intended to be limiting. Main finger tracking grip [000247] Figure 12 is an illustration of one example of a main finger tracking grip 1270. Main finger tracking grip 1270 is an example of main finger tracking grips 170, 270. [000248] Main finger tracking grip 1270 includes a compressible body 1210 and two finger loops 1220, 1230. Compressible body 1210 has a first end 1213 and a second end 1214. A body section 1215 extends between first end 1213 and second end 1214. [000249] Compressible body 1210 has an outer exterior surface. The outer exterior surface includes a first portion 1216 and a second portion 1217. First portion 1216, for example, the upper portion, extends between first end 1213 and second end 1214. Second portion 1217, for example, the lower portion, extends between first end 1213 and second end 1214. Second portion 1217 is opposite and removed from first portion 1216. [000250] In one aspect, the outer exterior surface is a fabric casing surface. The fabric is suitable for use in a surgical environment. The fabric casing contains a compressible foam. The foam is selected to provide resistance to compression and to expand as the compression is released. In one aspect, several strips of foam are included within the fabric casing. The foam must also be able to bend, so that first portion 1216 is positioned between the first and second fingers of the human hand as the tip of the first finger is moved toward the tip of the second finger. [000251] Body section 1215 has a length L between finger loop 1220 and finger loop 1230. As explained above, length L is selected to limit the separation between first finger loop 1220 and second finger loop 1230 (see figure 2A). [000252] In one aspect, body section 1215 has a thickness T. As illustrated in figure 2C, thickness T is selected so that, when main finger tracking grip 1270 is configured so that region 1236 in second portion 1217 of the outer exterior surface adjacent to end 1214 and region 1226 in second portion 1217 adjacent to end 1213 are just touching, second portion 1217 along length L is not in complete contact with itself. [000253] First finger loop 1220 is attached to compressible body 1210 adjacent to first end 1213. Loop 1220 extends over a region 1225 of first portion 1216 of the outer exterior surface of compressible body 1210. Upon placement of first finger loop 1220 on the first finger of the human hand, region 1225 comes into contact with the first finger; for example, a first part of first portion 1216 of the outer exterior surface comes into contact with the thumb. [000254] In this example, finger loop 1220 has two ends, a first fabric end 1221A and a second fabric end 1221B. End 1221A and end 1221B are the ends of a fabric strip that is attached to body 1210. A piece of loop fabric 1222B is attached to an inner surface of end 1221B, and a piece of hook fabric 1222A is attached to an outer surface of end 1221A. An example of hook fabric and loop fabric is a nylon fastening tape consisting of two strips of nylon fabric, one having small barbed threads and the other having a rough surface. The two strips form a strong bond when pressed together.
An example of a commercially available fastening tape is VELCRO® fastening tape. (VELCRO® is a registered trademark of Velcro Industries B.V.) [000255] Second finger loop 1230 is attached to compressible body 1210 adjacent to second end 1214. Loop 1230 extends over region 1235 of first portion 1216 of the exterior surface of compressible body 1210. Upon placement of second finger loop 1230 on the second finger of the human hand, region 1235 comes into contact with the second finger, for example, a second part of first portion 1216 of the exterior surface comes into contact with the index finger. Region 1235 of the first portion is opposite and spaced apart from region 1225 of the first portion. [000256] In this example, finger loop 1230 also has two ends, a first fabric end 1231A and a second fabric end 1231B. End 1231A and end 1231B are ends of a fabric strip that is attached to body 1210. Loop fabric piece 1232B is attached to an inner surface of end 1231B, and hook fabric piece 1232A is attached to an outer surface of end 1231A. [000257] A first location tracking sensor 1211 is attached to first finger loop 1220. A second location tracking sensor 1212 is attached to second finger loop 1230. The location tracking sensors can be any of the sensor elements described above. In one example, location tracking sensors 1211, 1212 are passive electromagnetic sensors. Visual Proxy System [000258] In one aspect, the hand tracking control system is used to control any one of a plurality of visual proxies that can be used by one surgeon to supervise another surgeon. For example, when surgeon 180 (figure 1A) is being supervised by surgeon 181 using main finger tracking grip 170, surgeon 181 uses main finger tracking grip 170 to control a visual proxy of a surgical instrument, while surgeon 180 uses the main tool grips to control a teleoperated slave surgical instrument. [000259] Alternatively, surgeon 181 can telestrate, or can control a virtual hand on the display. In addition, surgeon 181 can demonstrate how to manipulate a main tool grip on the surgeon's console by manipulating a virtual image of main tool grip 621 that is presented on the display. These examples of visual proxies are illustrative only and are not intended to be limiting. [000260] Additionally, the use of main finger tracking grip 170 while not at a surgeon's console is also illustrative and is not intended to be limiting. For example, with the presence detection system described above, a surgeon at a surgeon's console can remove a hand from a main tool grip and then use that hand to supervise another surgeon, as the hand is tracked by the hand tracking system. [000261] To facilitate supervision, in one aspect a visual proxy module (not shown) is executed as part of a vision processing subsystem. In this aspect, the executing module receives the position and orientation of the control point of the supervisor's hand, and renders stereo images, which are composited in real time with the images from the endoscopic camera and displayed in any combination of the surgeon's console 185, the auxiliary display, and the patient-side surgeon interface display 187. [000262] When surgeon 181 initiates supervision by taking a predefined action, for example, a hand gesture pose, a visual proxy system mode is activated, for example, the visual proxy module is executed on a processor module. The particular action, for example, the hand gesture pose, used as the predefined action is not essential, as long as system controller 140 (figure 1) is configured to recognize that action.
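As a non-limiting illustration of the real-time compositing described in paragraph [000261], the following Python sketch alpha-blends a rendered proxy overlay onto an endoscopic camera frame. NumPy is assumed; render_proxy (mentioned in the comments) is a hypothetical placeholder for the proxy renderer, which the description above does not specify.

```python
# Hypothetical compositing step for the visual proxy module described
# above: blend a rendered proxy overlay (RGBA) onto an endoscopic
# camera frame (RGB). Array shapes and the simple per-pixel alpha
# blend are assumptions of this sketch.

import numpy as np

def composite_proxy(camera_frame: np.ndarray, proxy_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend the proxy overlay onto the camera frame, per pixel."""
    alpha = proxy_rgba[..., 3:4].astype(np.float32) / 255.0  # H x W x 1
    proxy_rgb = proxy_rgba[..., :3].astype(np.float32)
    frame = camera_frame.astype(np.float32)
    blended = alpha * proxy_rgb + (1.0 - alpha) * frame
    return blended.astype(np.uint8)

# For a stereoscopic display, the same blend is applied to the left and
# right eye images with the proxy rendered from each eye's viewpoint:
#   left_out  = composite_proxy(left_frame,  render_proxy(pose, eye="left"))
#   right_out = composite_proxy(right_frame, render_proxy(pose, eye="right"))
```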
[000263] In one aspect, the visual proxy is a virtual ghost instrument 1311 (figure 13) controlled by main finger tracking grip 170, while teleoperated slave surgical instrument 1310 is controlled by one of the main tool manipulators of surgeon's console 185. Surgeon 181 sees both instruments 1310 and 1311 on display device 187, while surgeon 180 sees both instruments 1310 and 1311 on the stereoscopic display of surgeon's console 185. The use of virtual ghost instrument 1311 as a visual proxy is illustrative only and is not intended to limit the visual proxy to that particular image. In view of this description, other images can be used for the visual proxy that facilitate differentiation between the image representing the visual proxy and the actual image of the end effector of the teleoperated slave surgical instrument. [000264] Virtual ghost instrument 1311 appears similar to actual instrument 1310, except that virtual ghost instrument 1311 is displayed in a mode that clearly distinguishes virtual ghost instrument 1311 from actual instrument 1310 (for example, a transparent or translucent ghost-like image, a distinctly colored image, etc.). The control and operation of virtual ghost instrument 1311 are the same as those described above for an actual teleoperated surgical instrument. Thus, surgeon 181 can manipulate virtual ghost instrument 1311 using main finger tracking grip 170 to demonstrate the proper use of teleoperated slave surgical instrument 1310. Surgeon 180 can mimic the movement of virtual ghost instrument 1311 with instrument 1310. [000265] Virtual ghost instruments are described more fully in the commonly assigned US Patent Application Publication No. US 2009/0192523 A1 (filed March 31, 2009; describing "Synthetic Representation of a Surgical Instrument"), which is incorporated herein by reference in its entirety. See also US Patent Application No. 12/485,503 (filed June 16, 2009; describing "Virtual Measurement Tool for Minimally Invasive Surgery"); US Patent Application No. 12/485,545 (filed June 16, 2009; describing "Virtual Measurement Tool for Minimally Invasive Surgery"); US Patent Application Publication No. US 2009/0036902 A1 (filed August 11, 2008; describing "Interactive User Interfaces for Robotic Minimally Invasive Surgical Systems"); US Patent Application Publication No. US 2007/0167702 A1 (filed December 30, 2005; describing "Medical Robotic System Providing Three-Dimensional Telestration"); US Patent Application Publication No. US 2007/0156017 A1 (filed December 30, 2005; describing "Stereo Telestration for Robotic Surgery"); and US Patent Application Publication No. US 2010/0164950 A1 (filed May 13, 2009; describing "Efficient 3-D Telestration for Local Robotic Proctoring"), each of which is incorporated herein by reference in its entirety. [000266] In another aspect, the visual proxy is a pair of virtual hands 1410, 1411 (figure 14) controlled by main finger tracking grip 170 and a second main finger tracking grip, which is not visible in figure 1. Teleoperated slave surgical instruments 1420, 1421 are controlled by the main tool manipulators of surgeon's console 185. Surgeon 181 sees video image 1400 on display device 187, and surgeon 180 also sees video image 1400 on the stereoscopic display of surgeon's console 185. Virtual hands 1410, 1411 are displayed in a way that clearly distinguishes them from the other objects in video image 1400.
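Before the control details below, here is a sketch of one plausible way a control point position and a grip closure parameter could be derived from the two loop-mounted sensors 1211 (thumb) and 1212 (index finger). The midpoint and normalized-distance formulas are assumptions of this illustration, not the exact mapping defined earlier in this description; only the two-range structure of the closure parameter follows the text.

```python
# Illustrative sketch only: midpoint control point and a two-range
# grip closure parameter. d_touch and d_open are assumed calibration
# values, not parameters named in the description above.

import numpy as np

def control_point_and_closure(p_thumb: np.ndarray,
                              p_index: np.ndarray,
                              d_touch: float,
                              d_open: float):
    """Return (control point, grip closure parameter).

    d_touch: fingertip separation when thumb and index just touch.
    d_open:  separation with the fingers fully spread (limited by the
             length L of the grip body between the finger loops).
    """
    control_point = 0.5 * (p_thumb + p_index)
    d = float(np.linalg.norm(p_index - p_thumb))
    if d > d_touch:
        # First range [0, 1): fingers apart; 0.0 fully open.
        closure = (d_open - d) / (d_open - d_touch)
        closure = min(max(closure, 0.0), 1.0)
    else:
        # Second range starting at 1.0: fingertips touching, with the
        # compressible body squeezed further past the touch point.
        closure = 1.0 + (d_touch - d) / d_touch
    return control_point, closure
```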
[000267] The opening and closing of the thumb and index finger of a virtual hand are controlled using the grip closure parameter, which was described above. The position and orientation of the virtual hand are controlled by the control point position and orientation, as described above, which are mapped into an eye coordinate space, also as described above. [000268] Thus, as surgeon 181 moves the surgeon's right hand in three dimensions, virtual hand 1411 follows the movement in video image 1400. Surgeon 181 can roll virtual hand 1411 to instruct surgeon 180 to roll teleoperated slave surgical instrument 1421. Surgeon 181 can move virtual hand 1410 to a particular location and then use the movement of the thumb and index finger to instruct surgeon 180 to move teleoperated slave surgical instrument 1420 to that location and to grasp the tissue. When surgeon 180 grasps tissue with instrument 1420, surgeon 181 can use virtual hand 1410 to instruct surgeon 180 how to move the tissue. All of this takes place in real time, and virtual hands 1410, 1411 are superimposed on the stereoscopic endoscope image. However, the visual proxies can also be used in a monoscopic system. [000269] In another aspect, surgeon 181 changes the display mode using a hand gesture pose so that the visual proxies are a virtual ghost instrument 1510 and a virtual telestration device 1511, which are presented in a video image 1500 (figure 15). Telestration device 1511 is controlled by main finger tracking grip 170, while a second main finger tracking grip, which is not visible in figure 1, controls virtual ghost instrument 1510. [000270] Teleoperated slave surgical instruments 1520, 1521 are controlled by the main tool manipulators of surgeon's console 185. Surgeon 181 sees video image 1500 on display device 187, and surgeon 180 also sees video image 1500 on the stereoscopic display of surgeon's console 185. Virtual telestration device 1511 and virtual ghost instrument 1510 are displayed in a way that clearly distinguishes them from the other objects in video image 1500. [000271] To telestrate with virtual telestration device 1511, surgeon 181 places the thumb and index finger together, as if to pick up an imaginary pen or pencil, and then moves the right hand with the thumb and index finger in this position to telestrate on the displayed video image. In video image 1500, surgeon 181 has so positioned the thumb and index finger and made mark 1512 to illustrate where the tissue should be cut using surgical instrument 1521. After mark 1512 is made, surgeon 181 separates the thumb and index finger and moves virtual telestration device 1511 to the position shown in video image 1500.
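As a non-limiting illustration of the pen-down/pen-up behavior just described, and of the closure-parameter gating detailed in the next paragraph, the following Python sketch enables marking only while the grip closure parameter is in its second range. The class name, the 1.0 range boundary, and the per-cycle update interface are assumptions of this sketch.

```python
# Hypothetical telestration gating: marking is enabled only while the
# grip closure parameter is in the second range (thumb and index
# finger touching or pressed together). The 1.0 boundary is an
# assumption of this sketch.

SECOND_RANGE_START = 1.0  # closure value when the fingertips just touch

class TelestrationDevice:
    def __init__(self):
        self.marks = []           # completed marks (lists of 2D points)
        self._current = None      # mark being drawn, if any

    def update(self, closure: float, screen_xy: tuple):
        """Called each tracking cycle with the mapped control point."""
        if closure >= SECOND_RANGE_START:
            if self._current is None:
                self._current = []            # pen down: start a new mark
            self._current.append(screen_xy)
        elif self._current is not None:
            self.marks.append(self._current)  # pen up: finish the mark
            self._current = None
```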
[000272] The marking capability of virtual telestration device 1511 is controlled using the grip closure parameter, which was described above. As noted above, when the thumb and index finger are just touching, the grip closure parameter is mapped to an initial value in a second range, and so telestration is enabled for telestration device 1511 when the grip closure parameter is in the second range. The control point position and orientation, after being mapped to the eye coordinate system, are used to control the movement of virtual telestration device 1511. [000273] The above description and the accompanying drawings that illustrate the aspects and embodiments of the present invention should not be taken as limiting; the claims define the protected inventions. Various mechanical, compositional, structural, electrical, and operational changes can be made without departing from the spirit and scope of this description and the claims. In some instances, well-known circuits, structures, and techniques have not been shown or described in detail to avoid obscuring the present invention. [000274] In addition, the terminology of this description is not intended to limit the present invention. For example, spatially relative terms, such as "beneath", "below", "lower", "above", "upper", "proximal", "distal", and the like, can be used to describe the relationship of one element or feature to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (that is, locations) and orientations (that is, rotational arrangements) of the device in use or operation, in addition to the position and orientation shown in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be "above" or "over" the other elements or features. Thus, the exemplary term "below" can encompass positions and orientations of both above and below. The device can be otherwise oriented (rotated 90 degrees or at other orientations), and the spatially relative descriptors used herein are interpreted accordingly. Likewise, descriptions of movement along and around various axes include various device positions and orientations. [000275] The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context indicates otherwise. The terms "comprises", "comprising", "includes", and the like specify the presence of stated features, steps, operations, process elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, process elements, components, and/or groups. Components described as coupled may be directly coupled, electrically or mechanically, or they may be indirectly coupled by means of one or more intermediate components. [000276] Memory refers to a volatile memory, a non-volatile memory, or any combination of the two. A processor is coupled to a memory containing instructions executed by the processor. This can be accomplished within a computer system, or alternatively through a connection to another computer using modems and analog lines, or digital interfaces and a digital carrier line. [000277] Herein, a computer program product includes a medium configured to store computer-readable code needed for any one of, or any combination of, the processes described with respect to hand tracking, or in which computer-readable code for any one of, or any combination of, the processes described with respect to hand tracking is stored.
Some examples of computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy disks, magnetic tapes, computer hard drives, servers on a network, and signals transmitted over a network representing computer-readable program code. A non-transitory tangible computer program product includes a non-transitory tangible medium configured to store computer-readable instructions for any one of, or any combination of, the processes described with respect to the various controllers, or in which computer-readable instructions for any one of, or any combination of, the processes described with respect to the various controllers are stored. Non-transitory tangible computer program products are CD-ROM discs, DVD discs, flash memory, ROM cards, floppy disks, magnetic tapes, computer hard drives, and other non-transitory physical storage media. [000278] In view of this description, the instructions used in any one of, or any combination of, the processes described with respect to hand tracking can be implemented in a wide variety of computer system configurations using an operating system and computer programming language of interest to the user. [000279] The use of different memories and processors in figure 1 is illustrative only and is not intended to be limiting. In some aspects, a single hardware processor can be used, and in other aspects multiple processors can be used. [000280] Also, for ease of illustration, the various processes were distributed between a hand tracking controller and a system controller. This too is illustrative and is not intended to be limiting. The various processes can be distributed across controllers or consolidated in one controller without changing the principles of operation of the hand tracking process. [000281] All examples and illustrative references are non-limiting and should not be used to limit the claims to the specific implementations and embodiments described here and their equivalents. Headings are for formatting only and should not be used to limit the subject matter in any way, because text under one heading can refer or apply to text under one or more other headings. Finally, in view of this description, particular features described in relation to one aspect or embodiment can be applied to other described aspects or embodiments of the present invention, even though not specifically shown in the drawings or described in the text.
Claims (14) [0001] 1. Minimally invasive surgical system (100), characterized by the fact that it comprises: a hand tracking system (186), wherein the hand tracking system (186) is configured to track locations of a plurality of tracking sensors (1211, 1212) mounted on parts of a human hand; and a controller (130) coupled to the hand tracking system (186), wherein the controller (130) is configured to: select a hand gesture from a plurality of known hand gestures based on the plurality of locations; determine whether the selected hand gesture is a hand gesture pose (300A-300D) associated with a hand gesture trajectory (400A-400B); and, in response to the determination that the selected hand gesture is the hand gesture pose (300A-300D) associated with the hand gesture trajectory (400A-400B), select a hand gesture trajectory from a plurality of known hand gesture trajectories (400A-400B); and control the operation of the minimally invasive surgical system (100) based on the selected hand gesture trajectory. [0002] 2. Minimally invasive surgical system, according to claim 1, characterized by the fact that, to control the operation of the minimally invasive surgical system (100), the controller (130) is further configured to control a user interface based on the hand gesture trajectory (400A, 400B). [0003] 3. Minimally invasive surgical system, according to claim 1, characterized by the fact that, to control the operation of the minimally invasive surgical system (100), the controller is further configured to: use the tracking to control a visual proxy, wherein the visual proxy is selected from a group of visual proxies including a virtual hand (1410, 1411), a virtual surgical instrument, and a virtual telestration device (1511). [0004] 4. Minimally invasive surgical system, according to claim 1, characterized by the fact that, to select the hand gesture trajectory, the controller is further configured to: generate a velocity sequence from a trajectory of the plurality of locations; and convert the velocity sequence into a sequence of symbols. [0005] 5. Minimally invasive surgical system, according to claim 4, characterized by the fact that, to select the hand gesture trajectory, the controller is further configured to: analyze the sequence of symbols with a plurality of Hidden Markov Models, wherein each of the plurality of known hand gesture trajectories (400A, 400B) has a Hidden Markov Model in the plurality of Hidden Markov Models. [0006] 6. Minimally invasive surgical system, according to claim 1, characterized by the fact that the controller is further configured to: map the selected hand gesture trajectory to a system command. [0007] 7. Minimally invasive surgical system, according to claim 6, characterized by the fact that, to control the minimally invasive surgical system (100), the controller is configured to inject the system command into the minimally invasive surgical system (100). [0008] 8. Minimally invasive surgical system, according to claim 1, characterized by the fact that the hand gesture is a hand gesture pose (300A-300D) and the plurality of known hand gestures includes a plurality of known hand gesture poses (300A-300D). [0009] 9. Minimally invasive surgical system, according to claim 2, characterized by the fact that, to control the operation of the minimally invasive surgical system (100), the controller (130) is further configured to use the tracking to control a visual proxy. [0010] 10. Minimally invasive surgical system, according to claim 9, characterized by the fact that the visual proxy is selected from a group of visual proxies including a virtual hand (1410, 1411), a virtual surgical instrument, and a virtual telestration device (1511). [0011] 11. Minimally invasive surgical system, according to claim 1, characterized by the fact that, to select the hand gesture, the controller (130) is configured to generate a set of observed characteristics from the plurality of locations. [0012] 12. Minimally invasive surgical system, according to claim 11, characterized by the fact that, to select the hand gesture, the controller (130) is further configured to: compare the set of observed characteristics with the sets of characteristics of the plurality of known hand gesture poses (300A-300D). [0013] 13. Minimally invasive surgical system, according to claim 12, characterized by the fact that, to select the hand gesture, the controller (130) is further configured to: select a hand gesture pose (300A-300D) from the plurality of known hand gesture poses (300A-300D) based on the comparison. [0014] 14. Minimally invasive surgical system, according to claim 5, characterized by the fact that, to select the hand gesture, the controller (130) is further configured to select the hand gesture trajectory based on the analysis.
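Outside the claim language, the recognition pipeline recited in claims 4 and 5 (velocity sequence, symbol conversion, Hidden Markov Model analysis) can be illustrated with a short sketch. The direction-binning quantizer, the eight-symbol alphabet, and the forward-algorithm scoring below are generic textbook choices assumed for illustration; they are not the specific parameters of the claimed system.

```python
# Illustrative sketch of the claimed pipeline: positions -> velocity
# sequence -> discrete symbols -> per-gesture HMM scoring. All model
# parameters here are assumptions for illustration.

import numpy as np

def to_symbols(positions: np.ndarray, n_bins: int = 8) -> np.ndarray:
    """Convert an (N, 2) position trajectory into direction symbols."""
    v = np.diff(positions, axis=0)              # velocity sequence
    angles = np.arctan2(v[:, 1], v[:, 0])       # direction of motion
    return ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins

def log_likelihood(obs, start_p, trans_p, emit_p):
    """Forward algorithm (log domain) for a discrete-observation HMM."""
    alpha = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for o in obs[1:]:
        alpha = np.logaddexp.reduce(
            alpha[:, None] + np.log(trans_p), axis=0) + np.log(emit_p[:, o])
    return np.logaddexp.reduce(alpha)

def classify(positions, models):
    """Pick the known gesture trajectory whose HMM best explains obs.

    models: dict mapping gesture name -> (start_p, trans_p, emit_p).
    """
    obs = to_symbols(positions)
    scores = {name: log_likelihood(obs, *params)
              for name, params in models.items()}
    return max(scores, key=scores.get)
```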